@sophiaarichter @floatingtim #IndieWeb also has some useful advice and help in these areas. They use things like POSSE, backfeed, and webmentions that allow site-to-site interaction, which helps to solve the "bring folks to our sites" problem. https://indieweb.org/
{
"type": "entry",
"published": "2018-04-07T05:40:53+00:00",
"url": "http://stream.boffosocko.com/2018/sophiaarichter-floatingtim-indieweb-also-has-some-useful-advice-and-help",
"category": [
"IndieWeb"
],
"syndication": [
"https://twitter.com/ChrisAldrich/status/982493408852836353"
],
"in-reply-to": [
"https://twitter.com/floatingtim/status/981592536593641473"
],
"content": {
"text": "@sophiaarichter @floatingtim #IndieWeb also has some useful advice and help in these areas. They use things like POSSE, backfeed, and webmentions that allow site to site interaction that helps to solve the \"bring folks to our sites\" problem. https://indieweb.org/",
"html": "<a href=\"https://twitter.com/sophiaarichter\">@sophiaarichter</a> <a href=\"https://twitter.com/floatingtim\">@floatingtim</a> <a href=\"http://stream.boffosocko.com/tag/IndieWeb\" class=\"p-category\">#IndieWeb</a> also has some useful advice and help in these areas. They use things like POSSE, backfeed, and webmentions that allow site to site interaction that helps to solve the \"bring folks to our sites\" problem. <a href=\"https://indieweb.org/\">https://indieweb.org/</a>"
},
"author": {
"type": "card",
"name": "Chris Aldrich",
"url": "http://stream.boffosocko.com/profile/chrisaldrich",
"photo": "https://aperture-media.p3k.io/stream.boffosocko.com/d0ba9f65fcbf0cef3bdbcccc0b6a1f42b1310f7ab2e07208c7a396166cde26b1.jpg"
},
"_id": "200271",
"_source": "192",
"_is_read": true
}
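For anyone new to the pieces Chris names: Webmention is what makes the site-to-site interaction work. When a post on your site links to someone else's URL, your site discovers that page's Webmention endpoint and POSTs the source and target URLs to it. A minimal, hypothetical PHP sketch of that flow follows (the function name is invented, and discovery here only looks at the HTML; real senders also check the HTTP Link header and handle relative endpoint URLs):
<?php
// A minimal, hypothetical sketch of sending a Webmention: discover the target's
// endpoint, then POST source and target to it. Discovery here only inspects the
// HTML; real senders also check the HTTP Link header and resolve relative URLs.
function send_webmention($source, $target) {
    $html = file_get_contents($target);
    if ($html === false) return false;

    $doc = new DOMDocument();
    @$doc->loadHTML($html);   // @ silences warnings about HTML5 markup

    $endpoint = null;
    foreach (['link', 'a'] as $tag) {
        foreach ($doc->getElementsByTagName($tag) as $el) {
            $rels = preg_split('/\s+/', $el->getAttribute('rel'));
            if (in_array('webmention', $rels, true)) {
                $endpoint = $el->getAttribute('href');
                break 2;
            }
        }
    }
    if (!$endpoint) return false;

    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query(['source' => $source, 'target' => $target]),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    return $status >= 200 && $status < 300;
}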
The writing has long been on the wall: Twitter is going to destroy its third-party ecosystem. Why? To focus on controlling the entire experience for its proprietary platform. If you haven’t considered the #IndieWeb, now is the time. With services like Micro.blog available, it’s now easier than ever to participate in the open social web.
{
"type": "entry",
"published": "2018-04-06T17:22:45+00:00",
"url": "https://cleverdevil.io/2018/the-writing-has-long-been-on-the",
"category": [
"IndieWeb"
],
"syndication": [
"https://twitter.com/cleverdevil/status/982307638808662016"
],
"content": {
"text": "The writing has long been on the wall: Twitter is going to destroy its third-party ecosystem. Why? To focus on controlling the entire experience for their proprietary platform. If you haven\u2019t considered the #IndieWeb, now is the time. With services like Micro.blog available, it\u2019s now easier than ever to participate in the open social web.",
"html": "The writing has long been on the wall: Twitter is going to destroy its third-party ecosystem. Why? To focus on controlling the entire experience for their proprietary platform. If you haven\u2019t considered the <a href=\"https://cleverdevil.io/tag/IndieWeb\" class=\"p-category\">#IndieWeb</a>, now is the time. With services like Micro.blog available, it\u2019s now easier than ever to participate in the open social web."
},
"author": {
"type": "card",
"name": "Jonathan LaCour",
"url": "https://cleverdevil.io/profile/cleverdevil",
"photo": "https://aperture-media.p3k.io/cleverdevil.io/abdf4969f052cb64177f73cda9be6a709931eb55607f8c1fb2c69eb135841acf.jpg"
},
"_id": "198115",
"_source": "71",
"_is_read": true
}
I'm surprised that HTML5 support slipped through the cracks in PHP's DOM extension, but very glad Masterminds was there to cover for them. It's pretty easy to switch over and seems to be working well so far as a replacement HTML parser.
Only one quirk if you're not a Composer fan (the horror!): you need to write your own autoloader. That's fair enough given you're opting out of a controlled file-system structure, and luckily it's super easy. Instead of including Masterminds' HTML5.php directly you just need your own snippet, something like:
<?php
include 'HTML5.php';

// Map a namespaced class name to a file under library/ and include it if it exists.
function html5_autoload($className) {
    $file = 'library/' . str_replace('\\', '/', $className) . '.php';
    if (file_exists($file)) include $file;
}

spl_autoload_register('html5_autoload');
{
"type": "entry",
"published": "2018-04-06T13:54:25+10:00",
"url": "https://unicyclic.com/mal/2018-04-06-1455300078",
"category": [
"indieweb"
],
"content": {
"text": "I'm surprised that HTML5 support slipped through the cracks in PHP's DOM extension, but very glad Masterminds was there to cover for them. It's pretty easy to switch over and seems to be working well so far as a replacement HTML parser.\n\n\nOnly one quirk if you're not a composer fan, (the horror!) you need to write your own autoloader.\u00a0 That's fair enough given you're opting out of a controlled file system structure, and luckily it's super easy. Instead of including Mastermind's HTML5.php directly you just need your own snippet, something like:\n\n<?php\n\ninclude 'HTML5.php';\n\nfunction html5_autoload($className) {\n $file = 'library/' . str_replace('\\', '/', $className) . '.php';\n if (file_exists($file)) include $file;\n}\n\nspl_autoload_register('html5_autoload');",
"html": "I'm surprised that HTML5 support slipped through the cracks in PHP's DOM extension, but very glad <a href=\"https://github.com/Masterminds/html5-php\">Masterminds</a> was there to cover for them. It's pretty easy to switch over and seems to be working well so far as a replacement HTML parser.<br /><br />\nOnly one quirk if you're not a composer fan, (the horror!) you need to write your own autoloader.\u00a0 That's fair enough given you're opting out of a controlled file system structure, and luckily it's super easy. Instead of including Mastermind's HTML5.php directly you just need your own snippet, something like:\n<pre>\n<?php\n\ninclude 'HTML5.php';\n\nfunction html5_autoload($className) {\n $file = 'library/' . str_replace('\\', '/', $className) . '.php';\n if (file_exists($file)) include $file;\n}\n\nspl_autoload_register('html5_autoload');</pre>"
},
"author": {
"type": "card",
"name": "Malcolm Blaney",
"url": "https://unicyclic.com/mal",
"photo": "https://aperture-media.p3k.io/unicyclic.com/bdad1528925264a15ecd0bdb92bdc5836d965b0d5f4db8797489eec259fa32de.png"
},
"_id": "197048",
"_source": "243",
"_is_read": true
}
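As a follow-on to Malcolm's snippet: once that autoloader is registered, the Masterminds parser is close to a drop-in replacement for DOMDocument's loadHTML(). A small usage sketch, assuming the include and spl_autoload_register() lines above have already run (the example markup is invented):
<?php
// Assumes the autoloader snippet above has already run, so Masterminds\HTML5
// and its dependencies resolve from library/.
$html5 = new Masterminds\HTML5();

// loadHTML() returns an ordinary DOMDocument, so the familiar DOM API applies.
$dom = $html5->loadHTML('<article><p>Hello, <b>HTML5</b> world.</p></article>');
echo $dom->getElementsByTagName('p')->item(0)->textContent . "\n";

// saveHTML() serializes it back out using the HTML5 rules.
echo $html5->saveHTML($dom);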
{
"type": "entry",
"author": {
"name": "Colin Walker",
"url": "https://colinwalker.blog/",
"photo": null
},
"url": "https://colinwalker.blog/nomention/",
"published": "2018-04-06T09:48:48+00:00",
"content": {
"html": "<p>There was a recent scenario where I linked to a conversation on micro.blog but, as well as my post being fed through as normal, the generated webmention was interpreted as a reply meaning the full post content showed as a separate response in the conversation.</p>\n<p>Micro.blog doesn't make the distinction between webmention types so I wondered about editing the webmention plugin for WordPress by adding <code>class=\"nomention\"</code> or <code>rel=\"nomention\"</code> to a link so that it isn't processed along similar lines to <code>rel=\"nofollow\"</code>.</p>\n<p>Unknown to me at the time, <a href=\"http://boffosocko.com/2018/01/24/definition-of-submention/\">Chris Aldrich</a> had also recently proposed <code>rel=\"nomention\"</code> but I personally prefer using <code>class</code> as I can easily add this in Drafts using multi-markdown which is supported by WordPress via JetPack:</p>\n<pre><code>[Link text](http://link.here) {.nomention}\n</code></pre>\n<p>Matthias Pfefferle (the plugin author) suggested building a blacklist feature so named domains could be excluded but I wouldn't necessarily want this to be the case, having it more at the individual link level so as not to completely preclude sending webmentions to particular sites.</p>\n<p>The webmention plugin for WordPress uses the function <code>wp_extract_urls()</code> to grab all links from the post content so I thought about replacing this, finding all link tags instead then, for each that doesn't include nomention, get the url. The initial code looks like this:</p>\n<pre><code>preg_match_all('/<a[^>]+>/i',$post->post_content, $results); \n\n$mentions = '';\nforeach ($results as $link) {\n if (strpos($link, 'nomention')) {\n $mentions .= $link;\n }\n}\n\n$links = wp_extract_urls($mentions);\n</code></pre>\n<p>It may be preferable to check the full <code>class=\"nomention\"</code> just in case the url includes \u201cnomention\u201d - you never know.</p>\n<p>While this works, as pointed out, there are some issues with it in its current form. Firstly, it only deals with <code><a></code> tags so ignores images, videos, etc. but it could be extended for multiple tags:</p>\n<pre><code>(<a[^>]+>|<img[^>]+>|...)\n</code></pre>\n<p>A more pressing issue, however, is that most people do not, and will not, manually edit the HTML in their posts, especially with the release of the WordPress Gutenberg editor on the horizon. As such, the application of the relevant flag would need to be via an option in the UI. This could be easily achieved at the post level (a \"do not send webmentions for this post\" checkbox) but not so at the link level.</p>\n<p>The question also arises as to whether something like micro.blog should better handle webmention types rather than automatically making everything a reply. As Chris suggests, however, there could be other scenarios where not sending a webmention is preferred.</p>\n<p>This is likely an extreme edge case (at least I'm not the only one who's considered it) but I thought it worth discussion even if dismissed.</p>\n<p>The implementation works for me and my posting workflow so I'll keep it, even though it will mean manually editing the plugin each time it is updated.</p>",
"text": "There was a recent scenario where I linked to a conversation on micro.blog but, as well as my post being fed through as normal, the generated webmention was interpreted as a reply meaning the full post content showed as a separate response in the conversation.\nMicro.blog doesn't make the distinction between webmention types so I wondered about editing the webmention plugin for WordPress by adding class=\"nomention\" or rel=\"nomention\" to a link so that it isn't processed along similar lines to rel=\"nofollow\".\nUnknown to me at the time, Chris Aldrich had also recently proposed rel=\"nomention\" but I personally prefer using class as I can easily add this in Drafts using multi-markdown which is supported by WordPress via JetPack:\n[Link text](http://link.here) {.nomention}\n\nMatthias Pfefferle (the plugin author) suggested building a blacklist feature so named domains could be excluded but I wouldn't necessarily want this to be the case, having it more at the individual link level so as not to completely preclude sending webmentions to particular sites.\nThe webmention plugin for WordPress uses the function wp_extract_urls() to grab all links from the post content so I thought about replacing this, finding all link tags instead then, for each that doesn't include nomention, get the url. The initial code looks like this:\npreg_match_all('/<a[^>]+>/i',$post->post_content, $results); \n\n$mentions = '';\nforeach ($results as $link) {\n if (strpos($link, 'nomention')) {\n $mentions .= $link;\n }\n}\n\n$links = wp_extract_urls($mentions);\n\nIt may be preferable to check the full class=\"nomention\" just in case the url includes \u201cnomention\u201d - you never know.\nWhile this works, as pointed out, there are some issues with it in its current form. Firstly, it only deals with <a> tags so ignores images, videos, etc. but it could be extended for multiple tags:\n(<a[^>]+>|<img[^>]+>|...)\n\nA more pressing issue, however, is that most people do not, and will not, manually edit the HTML in their posts, especially with the release of the WordPress Gutenberg editor on the horizon. As such, the application of the relevant flag would need to be via an option in the UI. This could be easily achieved at the post level (a \"do not send webmentions for this post\" checkbox) but not so at the link level.\nThe question also arises as to whether something like micro.blog should better handle webmention types rather than automatically making everything a reply. As Chris suggests, however, there could be other scenarios where not sending a webmention is preferred.\nThis is likely an extreme edge case (at least I'm not the only one who's considered it) but I thought it worth discussion even if dismissed.\nThe implementation works for me and my posting workflow so I'll keep it, even though it will mean manually editing the plugin each time it is updated."
},
"name": "Nomention",
"_id": "196804",
"_source": "237",
"_is_read": true
}
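One note on the snippet quoted in Colin's post: as written it loops over $results rather than $results[0], and it keeps the links that do contain "nomention" instead of the ones that don't. A corrected sketch of the intent, for the same WordPress plugin context (the exact class check is an assumption; as the post suggests, matching the full class="nomention" avoids tripping on URLs that merely contain the word):
<?php
// Keep only the <a> tags that do NOT carry the nomention class, then hand them
// to wp_extract_urls() as the plugin already does. $post->post_content is
// assumed to be available, as in the plugin context described above.
preg_match_all('/<a[^>]+>/i', $post->post_content, $results);

$mentionable = '';
foreach ($results[0] as $link) {   // $results[0] holds the matched tags themselves
    if (strpos($link, 'class="nomention"') === false) {
        $mentionable .= $link;
    }
}

// Links carrying several classes (class="foo nomention") would need a looser
// check, e.g. a regex over the class attribute.
$links = wp_extract_urls($mentionable);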
{
"type": "entry",
"published": "2018-04-06T00:52:40+00:00",
"url": "http://stream.boffosocko.com/2018/anyone-who-didnt-get-enough-of-open-domains-lab-this",
"category": [
"IndieWeb",
"DoOO"
],
"content": {
"text": "Anyone who didn't get enough of Open Domains Lab this afternoon is more than welcome to join in on the #IndieWeb chat to continue conversing and building their website. #DoOO\nhttps://twitter.com/TaylorJadin/status/979489643925295104\nhttps://indieweb.org/discuss",
"html": "Anyone who didn't get enough of Open Domains Lab this afternoon is more than welcome to join in on the <a href=\"http://stream.boffosocko.com/tag/IndieWeb\" class=\"p-category\">#IndieWeb</a> chat to continue conversing and building their website. <a href=\"http://stream.boffosocko.com/tag/DoOO\" class=\"p-category\">#DoOO</a><br /><a href=\"https://twitter.com/TaylorJadin/status/979489643925295104\">https://twitter.com/TaylorJadin/status/979489643925295104</a><br /><a href=\"https://indieweb.org/discuss\">https://indieweb.org/discuss</a>"
},
"author": {
"type": "card",
"name": "Chris Aldrich",
"url": "http://stream.boffosocko.com/profile/chrisaldrich",
"photo": "https://aperture-media.p3k.io/stream.boffosocko.com/d0ba9f65fcbf0cef3bdbcccc0b6a1f42b1310f7ab2e07208c7a396166cde26b1.jpg"
},
"_id": "195939",
"_source": "192",
"_is_read": true
}
{
"type": "entry",
"author": {
"name": "Colin Walker",
"url": "https://colinwalker.blog/",
"photo": null
},
"url": "https://colinwalker.blog/05-04-2018-2242/",
"published": "2018-04-05T22:43:15+00:00",
"content": {
"html": "<p>This is just a test to see if I can restrict which links webmentions are sent to:</p>\n<p>\u2013 <a href=\"https://colinwalker.blog/05-04-2018-1628/\">Link 1</a></p>\n<p>\u2013 <a href=\"https://colinwalker.blog/04-04-2018-1535/\">Link 2</a></p>",
"text": "This is just a test to see if I can restrict which links webmentions are sent to:\n\u2013 Link 1\n\u2013 Link 2"
},
"_id": "195630",
"_source": "237",
"_is_read": true
}
@spigot Yes, you have to publish first. Looks like there's something putting a lot of hidden SVG files into your content. A plugin, perhaps?
You can also find some potential help in the chat at https://chat.indieweb.org/wordpress/
{
"type": "entry",
"published": "2018-04-05T20:42:57+00:00",
"url": "http://stream.boffosocko.com/2018/spigot-yes-you-have-to-publish-first-looks-like-theres",
"syndication": [
"https://twitter.com/ChrisAldrich/status/981995652954673153"
],
"in-reply-to": [
"https://twitter.com/spigot/status/981991766415978496"
],
"content": {
"text": "@spigot Yes, you have to publish first. Looks like there's something putting a lot of hidden svg files into your content. A plugin perhaps?\n\nYou can also find some potential help in the chat at https://chat.indieweb.org/wordpress/",
"html": "<a href=\"https://twitter.com/spigot\">@spigot</a> Yes, you have to publish first. Looks like there's something putting a lot of hidden svg files into your content. A plugin perhaps?<br />\nYou can also find some potential help in the chat at <a href=\"https://chat.indieweb.org/wordpress/\">https://chat.indieweb.org/wordpress/</a>"
},
"author": {
"type": "card",
"name": "Chris Aldrich",
"url": "http://stream.boffosocko.com/profile/chrisaldrich",
"photo": "https://aperture-media.p3k.io/stream.boffosocko.com/d0ba9f65fcbf0cef3bdbcccc0b6a1f42b1310f7ab2e07208c7a396166cde26b1.jpg"
},
"_id": "195243",
"_source": "192",
"_is_read": true
}
{
"type": "entry",
"published": "2018-04-05T20:15:48+00:00",
"url": "http://stream.boffosocko.com/2018/spigot-its-not-bridgy-instead-it-looks-like-its-caused",
"syndication": [
"https://twitter.com/ChrisAldrich/status/981988822446673920"
],
"in-reply-to": [
"https://twitter.com/spigot/status/981957032537305088"
],
"content": {
"text": "@spigot It's not Brid.gy, instead it looks like it's caused by the microformats and their placement in your particular theme. https://brid.gy/about/#microformats\n\nYou can find a preview of what Bridgy will publish at https://brid.gy/twitter/spigot if you want to test before sending.",
"html": "<a href=\"https://twitter.com/spigot\">@spigot</a> It's not Brid.gy, instead it looks like it's caused by the microformats and their placement in your particular theme. <a href=\"https://brid.gy/about/#microformats\">https://brid.gy/about/#microformats</a><br />\nYou can find a preview of what Bridgy will publish at <a href=\"https://brid.gy/twitter/spigot\">https://brid.gy/twitter/spigot</a> if you want to test before sending."
},
"author": {
"type": "card",
"name": "Chris Aldrich",
"url": "http://stream.boffosocko.com/profile/chrisaldrich",
"photo": "https://aperture-media.p3k.io/stream.boffosocko.com/d0ba9f65fcbf0cef3bdbcccc0b6a1f42b1310f7ab2e07208c7a396166cde26b1.jpg"
},
"_id": "195244",
"_source": "192",
"_is_read": true
}
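A practical way to see roughly what Bridgy's parser sees, short of the preview page Chris links, is to run the theme's output through the php-mf2 parser (the same library whose new release is mentioned further down this page). A hedged sketch with a placeholder URL, assuming mf2/mf2 has been installed via Composer:
<?php
// Parse a page's microformats and dump the properties of any top-level h-entry,
// roughly the data Bridgy Publish draws from. The URL is a placeholder.
require 'vendor/autoload.php';   // composer require mf2/mf2

$url    = 'https://example.com/2018/04/05/some-post/';
$parsed = Mf2\parse(file_get_contents($url), $url);

foreach ($parsed['items'] as $item) {
    if (in_array('h-entry', $item['type'], true)) {
        print_r(array_keys($item['properties']));
    }
}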
{
"type": "entry",
"published": "2018-04-05T12:14:11+0000",
"url": "http://known.kevinmarks.com/2018/digital-catapult-open-call-for-future-social",
"category": [
"indieweb"
],
"syndication": [
"https://twitter.com/kevinmarks/status/981867598789791744"
],
"content": {
"text": "Digital Catapult open call for Future Social Media pitches: https://www.digitalcatapultcentre.org.uk/open-calls/future-social-media-enhancing-cohesion-through-a... - in EU with \"a project, innovation or solution using social media that aims to enhance cohesion between technology providers, creative content makers, their users/adopters and research\" #indieweb",
"html": "Digital Catapult open call for Future Social Media pitches: <a href=\"https://www.digitalcatapultcentre.org.uk/open-calls/future-social-media-enhancing-cohesion-through-advanced-digital-technology/\">https://www.digitalcatapultcentre.org.uk/open-calls/future-social-media-enhancing-cohesion-through-a...</a> - in EU with \"a project, innovation or solution using social media that aims to enhance cohesion between technology providers, creative content makers, their users/adopters and research\" <a href=\"http://known.kevinmarks.com/tag/indieweb\" class=\"p-category\">#indieweb</a>"
},
"author": {
"type": "card",
"name": "Kevin Marks",
"url": "http://known.kevinmarks.com/profile/kevinmarks",
"photo": "https://aperture-media.p3k.io/known.kevinmarks.com/f893d11435a62200ec9585e0ea3d84b2bdc478aa0a056dda35a43ce4c04d58a0.jpg"
},
"_id": "193766",
"_source": "205",
"_is_read": true
}
“I think we will dig through this hole, but it will take a few years,” Zuckerberg said.
Has Zuck been so isolated from criticism that no one ever said "when you're in a hole, stop digging" to him? #indieweb https://t.co/tYEivTKUQd?amp=1
{
"type": "entry",
"published": "2018-04-02T10:49:15+0000",
"url": "http://known.kevinmarks.com/2018/i-think-we-will-dig-through-this",
"category": [
"indieweb"
],
"syndication": [
"https://twitter.com/kevinmarks/status/980759059543465984"
],
"content": {
"text": "\u201cI think we will dig through this hole, but it will take a few years,\u201d Zuckerberg said.\n\n\nHas Zuck been so isolated from criticism that no one ever said "when you're in a hole, stop digging" to him? #indieweb https://t.co/tYEivTKUQd?amp=1",
"html": "\u201cI think we will dig through this hole, but it will take a few years,\u201d Zuckerberg said.<br /><br />\nHas Zuck been so isolated from criticism that no one ever said &quot;when you're in a hole, stop digging&quot; to him? <a href=\"http://known.kevinmarks.com/tag/indieweb\" class=\"p-category\">#indieweb</a> <a href=\"https://t.co/tYEivTKUQd?amp=1\">https://t.co/tYEivTKUQd?amp=1</a>"
},
"author": {
"type": "card",
"name": "Kevin Marks",
"url": "http://known.kevinmarks.com/profile/kevinmarks",
"photo": "https://aperture-media.p3k.io/known.kevinmarks.com/f893d11435a62200ec9585e0ea3d84b2bdc478aa0a056dda35a43ce4c04d58a0.jpg"
},
"_id": "182497",
"_source": "205",
"_is_read": true
}
{
"type": "entry",
"published": "2018-04-02 00:15-0700",
"url": "http://tantek.com/2018/092/t1/reduce-use-of-facebook",
"content": {
"text": "@rolandturner great post. Insightful, agreed on \"two fronts\".\nRe: reduce use of Facebook, some steps: https://indieweb.org/Facebook#How_to_wean_yourself_from\n\nHere\u2019s also a page to help reduce appaccess to FB login, more: https://indieweb.org/appaccess\n\nReply sent from my indieweb site",
"html": "<a class=\"h-cassis-username\" href=\"https://twitter.com/rolandturner\">@rolandturner</a> great post. Insightful, agreed on \"two fronts\".<br />Re: reduce use of Facebook, some steps: <a href=\"https://indieweb.org/Facebook#How_to_wean_yourself_from\">https://indieweb.org/Facebook#How_to_wean_yourself_from</a><br /><br />Here\u2019s also a page to help reduce appaccess to FB login, more: <a href=\"https://indieweb.org/appaccess\">https://indieweb.org/appaccess</a><br /><br />Reply sent from my indieweb site"
},
"author": {
"type": "card",
"name": "Tantek \u00c7elik",
"url": "http://tantek.com/",
"photo": "https://aperture-media.p3k.io/tantek.com/acfddd7d8b2c8cf8aa163651432cc1ec7eb8ec2f881942dca963d305eeaaa6b8.jpg"
},
"_id": "183434",
"_source": "1",
"_is_read": true
}
{
"type": "entry",
"published": "2018-04-01T16:42:00-07:00",
"url": "https://aaronparecki.com/2018/04/01/39/q1-review",
"category": [
"review"
],
"name": "First Quarter 2018 in Review",
"content": {
"text": "JanuaryEvents\nWent to Baltimore to help put on IndieWebCamp! It was a lot of fun, and I even added a couple fun things to my website during the second day.\nI also filmed the talks at the\u00a0DonutJS\u00a0meetup.\nPodcasts\nI managed to publish only one episode of my podcast, Percolator, just before heading to Baltimore.\nWe launched applications for the StreamPDX Podcast Fellowship Program in January! We received way more applications than we expected!\nIndieWeb Projects\nWe published the final version of WebSub, and the IndieAuth note, on w3.org! Thanks to the hard work of the Social Web Working Group for all their contributions!\nMy Website\nI made several improvements to my website during January!\n\n Updated my Life Stack post\n \n\n Launched the collaborative pixel art on my home page\n Added support for posting code snippets to my site to switch off of gist.github.com\n Added a summary of my blog post archives so the archive pages are easier to use\n \n Added meta tags so my site looks better when\u00a0links are posted to\u00a0Twitter/Facebook/Slack\n \n\nOther Stuff\nFinally got my OAuth 2.0 book launched for Kindle! It turns out that the Kindle requirements made it a lot more work than just uploading the existing ePub version.\nFebruaryEvents\nOkta hosted Iterate 2018, where I went and had a great time chatting with people about OAuth and giving out copies of my book.\nPodcasts\nAgain I managed to publish only one episode of Percolator during the month.\nThe StreamPDX team reviewed all the applications to the fellowship program, and it was really tough to narrow them down! We ended up inviting a handful of people in for interviews, and chose 4 out of that group. I began working with them on their podcasts, making pretty good progress the first few weeks!\nI also taught the first Publishing your Podcast class of the season.\nIndieWeb Projects\nI made a lot of progress on my new IndieWeb reader during February! I wanted to get it in shape enough to use it during the conferences I was attending. I decided to split it into two parts, a Microsub server (Aperture) with no UI for viewing posts, and a separate client that has no storage backend of its own (Monocle).\nI added a minor feature to OwnYourGram, made some minor changes to XRay, and released an updated version of the PHP IndieAuth client.\nMarch\nBig news in March! I accepted a full-time job at Okta! I've been working with Okta for quite some time now, but always part time as a contractor. I've written up more about what I'll be doing at Okta on the Okta Developers blog!\nEvents\n\n Co-hosted the first Homebrew Microblog Meetup with Jean\n \n\n Went to the PDXNode Hack Night\u00a0and did a lightning talk about Monocle, my IndieWeb reader\n \n Filmed the talks at DonutJS, but had some technical issues with the audio, so those videos aren't nearly as good this month\n \n\nIndieWeb Projects\nI made lots of progress on Monocle, getting it to a point where I now can use it every day as my primary home on the Internet. I wrote a blog post describing how everything works, Building an IndieWeb Reader.\nThanks to the hard work of gRegor and Martijn, we were able to get a new release of the PHP Microformats parser out the door! 
This is now in use by Monocle, which should improve a lot of the feeds it's seeing.\nPodcasts\nPercolator is turning into a monthly podcast, as I managed to again get only one episode out during March.\nThe StreamPDX fellowship program is continuing, I've been writing some music for one of the podcasts which has been fun, but a lot of work.\nWe brought the StreamPDX trailer to the Portland Art Museum to record audio during an event!\nI taught another session of Publishing Your Podcast.\n\n Other Stuff\n \n\nI finally set up an account at exist.io! I went through the list of all their supported integrations, and decided to customize a bunch of them.\nSince I post notes and photos to places other than Twitter and Instagram, my website is the canonical source of my tweets and photos as well as the responses I get from them. I was able to use the Exist API to take over writing those values and now my Tweet/Instagram counts in Exist actually reflect the notes and photos I post to my own website.\nI noticed they also support tracking miles biked, so since my bike rides are already on my website, I set up a script to push that data to Exist!\nI also track other kinds of transport, and decided to use their custom tracking to visualize that per day. So now I can see at a glance which days I was on a bike, in a taxi, on a train, etc! It's pretty neat looking already, and I'm hoping they'll be able to be used in some insights later!\nThe Exist API even has a section for tracking money spent, although they don't integrate with YNAB (yet!). I got beta access to the YNAB API and was able to wire it up to report my spending from certain budget categories into Exist!",
"html": "<h2>January</h2><h3>Events</h3>\n<p>Went to Baltimore to help put on IndieWebCamp! It was a lot of fun, and I even <a href=\"https://aaronparecki.com/2018/01/21/11/pixel-art\">added a couple fun things</a> to my website during the second day.</p>\n<img src=\"https://aperture-media.p3k.io/aaronparecki.com/fbae6fee884dbf6d882cdaea99b315255efa12ddad58f0a43cdd85e6c5a270a9.jpg\" alt=\"\" /><p>I also filmed the talks at the\u00a0<a href=\"https://www.youtube.com/watch?v=IuPGpAuGYlo&list=PLclEcT4yxER4RtyVkT8mJ58aNG6smYsaD\">DonutJS</a>\u00a0meetup.</p>\n<h3>Podcasts</h3>\n<p>I managed to publish only one <a href=\"https://percolator.today/episode/17\">episode of my podcast</a>, Percolator, just before heading to Baltimore.</p>\n<p>We launched applications for the StreamPDX Podcast Fellowship Program in January! We received way more applications than we expected!</p>\n<h3>IndieWeb Projects</h3>\n<p>We <a href=\"https://aaronparecki.com/2018/01/23/34/w3c-websub-indieauth\">published the final version</a> of WebSub, and the IndieAuth note, on w3.org! Thanks to the hard work of the Social Web Working Group for all their contributions!</p>\n<img src=\"https://aperture-media.p3k.io/aaronparecki.com/059fdc6eca9cad00118e7ae8b214d5a21dd0e56cf09b5596815bd5e011dc6c52.jpg\" alt=\"\" /><h3>My Website</h3>\n<p>I made several improvements to my website during January!</p>\n<ul><li>\n Updated my <a href=\"https://aaronparecki.com/life-stack/\">Life Stack</a> post\n <br /></li>\n <li>Launched the <a href=\"https://aaronparecki.com/2018/01/21/11/pixel-art\">collaborative pixel art</a> on my home page</li>\n <li>Added support for <a href=\"https://aaronparecki.com/2018/01/06/13/code-snippets\">posting code snippets</a> to my site to switch off of gist.github.com</li>\n <li>Added <a href=\"https://aaronparecki.com/2018/01/06/12/blog-archives\">a summary of my blog post archives</a> so the archive pages are easier to use</li>\n <li>\n Added meta tags so my site looks better when\u00a0<a href=\"https://aaronparecki.com/2018/01/05/10/rich-link-previews\">links are posted</a> to\u00a0Twitter/Facebook/Slack\n <br /></li>\n</ul><h3>Other Stuff</h3>\n<p>Finally got my OAuth 2.0 book <a href=\"https://aaronparecki.com/2018/01/09/4/oauth2-simplified-kindle\">launched for Kindle</a>! It turns out that the Kindle requirements made it a lot more work than just uploading the existing ePub version.</p>\n<h2>February</h2><h3>Events</h3>\n<p>Okta hosted <a href=\"https://developer.okta.com/blog/2018/03/14/hosting-our-first-developer-conference-iterate\">Iterate 2018</a>, where I went and had a great time chatting with people about OAuth and giving out copies of my book.</p>\n<h3>Podcasts</h3>\n<p>Again I managed to publish only <a href=\"https://percolator.today/episode/18\">one episode</a> of Percolator during the month.</p>\n<p>The StreamPDX team reviewed all the applications to the fellowship program, and it was really tough to narrow them down! We ended up inviting a handful of people in for interviews, and chose 4 out of that group. I began working with them on their podcasts, making pretty good progress the first few weeks!</p>\n<p>I also taught the first <a href=\"https://streampdx.com/classes\">Publishing your Podcast</a> class of the season.</p>\n<h3>IndieWeb Projects</h3>\n<p>I made a lot of progress on my new IndieWeb reader during February! I wanted to get it in shape enough to use it during the conferences I was attending. 
I decided to split it into two parts, a Microsub server (Aperture) with no UI for viewing posts, and a separate client that has no storage backend of its own (Monocle).</p>\n<p>I added a minor feature to <a href=\"https://aaronparecki.com/2018/02/05/6/ownyourgram\">OwnYourGram</a>, made some minor changes to <a href=\"https://github.com/aaronpk/XRay\">XRay</a>, and released an updated version of the <a href=\"https://aaronparecki.com/2018/02/07/7/indieauth\">PHP IndieAuth client</a>.</p>\n<h2>March</h2>\n<p>Big news in March! I accepted a full-time job at Okta! I've been working with Okta for quite some time now, but always part time as a contractor. I've written up more about <a href=\"https://developer.okta.com/blog/2018/03/27/welcome-aaron-okta\">what I'll be doing at Okta</a> on the Okta Developers blog!</p>\n<h3>Events</h3>\n<ul><li>\n Co-hosted the first <a href=\"https://indieweb.org/events/2018-03-14-homebrew-microblog\">Homebrew Microblog Meetup</a> with <a href=\"http://jeanmacdonald.me/\">Jean</a>\n <br /></li>\n <li>Went to the <a href=\"https://aaronparecki.com/2018/03/29/46/pdxnode\">PDXNode Hack Night</a>\u00a0and did a lightning talk about Monocle, my IndieWeb reader</li>\n <li>\n Filmed the talks at <a href=\"https://www.youtube.com/watch?v=77rk1uYzayM&list=PLclEcT4yxER4dPvNgw8n-aQlcA1AtgrI2\">DonutJS</a>, but had some technical issues with the audio, so those videos aren't nearly as good this month\n <br /></li>\n</ul><h3>IndieWeb Projects</h3>\n<p>I made lots of progress on Monocle, getting it to a point where I now can use it every day as my primary home on the Internet. I wrote a blog post describing how everything works, <a href=\"https://aaronparecki.com/2018/03/12/17/building-an-indieweb-reader\">Building an IndieWeb Reader</a>.</p>\n<img src=\"https://aperture-media.p3k.io/aaronparecki.com/9baeb7c64cd56a693cedc7f5e0c4adc3283755e70506343c8ea29e4c18a8b3b8.jpg\" alt=\"\" /><p>Thanks to the hard work of <a href=\"https://gregorlove.com/\">gRegor</a> and <a href=\"https://vanderven.se/martijn/\">Martijn</a>, we were able to get <a href=\"https://aaronparecki.com/2018/03/29/6/php-mf2\">a new release</a> of the PHP Microformats parser out the door! This is now in use by Monocle, which should improve a lot of the feeds it's seeing.</p>\n<h3>Podcasts</h3>\n<p>Percolator is turning into a monthly podcast, as I managed to again get only <a href=\"https://percolator.today/episode/19\">one episode</a> out during March.</p>\n<p>The StreamPDX fellowship program is continuing, I've been writing some music for one of the podcasts which has been fun, but a lot of work.</p>\n<p>We <a href=\"https://aaronparecki.com/2018/03/10/34/\">brought the StreamPDX trailer</a> to the Portland Art Museum to record audio during an event!</p>\n<p>I taught another session of <a href=\"https://streampdx.com/classes\">Publishing Your Podcast</a>.</p>\n<h3>\n Other Stuff\n <br /></h3>\n<p>I finally set up an account at <a href=\"https://exist.io/?referred_by=aaronpk\">exist.io</a>! I went through the list of all their supported integrations, and decided to customize a bunch of them.</p>\n<p>Since I post notes and photos to places other than Twitter and Instagram, my website is the canonical source of my tweets and photos as well as the responses I get from them. 
I was able to use the Exist API to take over writing those values and now my Tweet/Instagram counts in Exist actually reflect the notes and photos I post to my own website.</p>\n<img src=\"https://aperture-media.p3k.io/aaronparecki.com/d9fee254c6d5fd4f55be1e83d027e9922f9a041b6d20f853ab77a409a8bd0151.png\" alt=\"\" /><p>I noticed they also support tracking miles biked, so since <a href=\"https://aaronparecki.com/rides\">my bike rides</a> are already on my website, I set up a script to push that data to Exist!</p>\n<p>I also track other kinds of transport, and decided to use their custom tracking to visualize that per day. So now I can see at a glance which days I was on a bike, in a taxi, on a train, etc! It's pretty neat looking already, and I'm hoping they'll be able to be used in some insights later!</p>\n<img src=\"https://aperture-media.p3k.io/aaronparecki.com/e735cb45380fa6d10f638d7e6e08c4b7213c65612c0b9a405fd8703757d345c6.png\" alt=\"\" /><p>The Exist API even has a section for tracking money spent, although they don't integrate with YNAB (yet!). I got beta access to the YNAB API and was able to wire it up to report my spending from certain budget categories into Exist!</p>\n<img src=\"https://aperture-media.p3k.io/aaronparecki.com/09230a9843d3467f3084a18b4850c3b7e84c669cba03cfa2565fe33cc18ab9c5.png\" alt=\"\" />"
},
"author": {
"type": "card",
"name": "Aaron Parecki",
"url": "https://aaronparecki.com/",
"photo": "https://aperture-media.p3k.io/aaronparecki.com/2b8e1668dcd9cfa6a170b3724df740695f73a15c2a825962fd0a0967ec11ecdc.jpg"
},
"_id": "181608",
"_source": "16",
"_is_read": true
}
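For context on the Aperture/Monocle split Aaron describes: the Microsub spec keeps all storage behind a server endpoint, and the client simply asks it for a channel's timeline. A hedged sketch of that request (endpoint, token, and channel uid are placeholders); the items that come back are jf2 entries much like the JSON records shown throughout this page:
<?php
// Fetch one channel's timeline from a Microsub server and print a line per entry.
// Endpoint, token, and channel uid are placeholders.
$endpoint = 'https://aperture.example/microsub';
$token    = 'XXXX';

$ch = curl_init($endpoint . '?' . http_build_query([
    'action'  => 'timeline',
    'channel' => 'indieweb',
]));
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $token],
]);
$timeline = json_decode(curl_exec($ch), true);
curl_close($ch);

foreach ($timeline['items'] as $entry) {
    // Entries are jf2: notes carry content, articles carry a name.
    echo ($entry['name'] ?? $entry['content']['text'] ?? '(no text)') . "\n";
}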
I'd like to be able to consume content from https://micro.blog and comment on people's posts and get back their comments, but without actually needing to use their software. It's kind of close because they use webmentions and I've seen some links to RSS feeds, but the UI if you're not logged in is quite awful.
I think they would get much more traction if they did some homework on not putting up those walls around their garden like everyone else does. I'll follow this one guy now and we'll see if I can get into this community with just my own software and without signing up for a username there.
{
"type": "entry",
"published": "2018-04-01T18:45:42Z",
"url": "https://jeena.net/notes/913",
"content": {
"text": "I'd like to be able to consume content from https://micro.blog and comment on peoples posts and get back their comments but without actually the need to use their software. It's kind of close because they use webmentions and I've seen some links to RSS feeds, but the UI if you're not logged in is quite awful.\n\nFor example, I found https://micro.blog/jthingelstad by randomly typing https://manton.micro.blog/ which turned out to be the test blog of the creator of Micro.blog all the way down in the footer I found a link to https://micro.blog/manton which for some reason has compleatly different content than the subdomain. There I saw him mentioning https://micro.blog/jthingelstad so I rewrote the URL to https://jthingelstad.micro.blog and was looking for a link to their RSS. The footer didn't have one but the HTML head has one which luckily my browser shows. https://micro.blog/jthingelstad didn't have a link to that RSS feed nor to the subdomain where I can find the link to the RSS feed.\n\nI think they would get much more traction if they did some homework on not putting up those walls around their garden like everyone else does. I'll follow this one guy now and we'll see if I can get into this community with just my own software and without signing up for a username there.",
"html": "<p></p><p>I'd like to be able to consume content from <a href=\"https://micro.blog\">https://micro.blog</a> and comment on peoples posts and get back their comments but without actually the need to use their software. It's kind of close because they use webmentions and I've seen some links to RSS feeds, but the UI if you're not logged in is quite awful.</p>\n\n<p>For example, I found <a href=\"https://micro.blog/jthingelstad\">https://micro.blog/jthingelstad</a> by randomly typing <a href=\"https://manton.micro.blog/\">https://manton.micro.blog/</a> which turned out to be the test blog of the creator of Micro.blog all the way down in the footer I found a link to <a href=\"https://micro.blog/manton\">https://micro.blog/manton</a> which for some reason has compleatly different content than the subdomain. There I saw him mentioning <a href=\"https://micro.blog/jthingelstad\">https://micro.blog/jthingelstad</a> so I rewrote the URL to <a href=\"https://jthingelstad.micro.blog\">https://jthingelstad.micro.blog</a> and was looking for a link to their RSS. The footer didn't have one but the HTML head has one which luckily my browser shows. <a href=\"https://micro.blog/jthingelstad\">https://micro.blog/jthingelstad</a> didn't have a link to that RSS feed nor to the subdomain where I can find the link to the RSS feed.</p>\n\n<p>I think they would get much more traction if they did some homework on not putting up those walls around their garden like everyone else does. I'll follow this one guy now and we'll see if I can get into this community with just my own software and without signing up for a username there.</p>"
},
"author": {
"type": "card",
"name": "Jeena",
"url": "https://jeena.net/",
"photo": "https://aperture-media.p3k.io/jeena.net/d265fd0a7b0bc15c7d4df4534b596d15b6039da1eab9482dda49db1a62fe1919.jpg"
},
"_id": "180295",
"_source": "201",
"_is_read": true
}
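Jeena's hunt for the feed link can be automated: the hosted blog advertises its feed in the HTML head, presumably via the usual rel="alternate" convention, so a few lines of DOM work surface it. A small sketch using the URL from the post (error handling omitted):
<?php
// List the feed URLs advertised in a page's HTML head, the link Jeena had to dig
// out of the markup by hand. Assumes the usual rel="alternate" feed convention.
$url  = 'https://jthingelstad.micro.blog/';
$html = file_get_contents($url);

$doc = new DOMDocument();
@$doc->loadHTML($html);   // @ silences warnings about HTML5 markup

foreach ($doc->getElementsByTagName('link') as $link) {
    $type = $link->getAttribute('type');
    if ($link->getAttribute('rel') === 'alternate'
        && preg_match('#(rss|atom|json)#i', $type)) {
        echo $type . ' => ' . $link->getAttribute('href') . "\n";
    }
}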
{
"type": "entry",
"published": "2018-03-31T19:52:40-04:00",
"url": "https://martymcgui.re/2018/03/31/195240/",
"category": [
"podcast",
"IndieWeb",
"this-week-indieweb-podcast"
],
"audio": [
"https://aperture-media.p3k.io/media.martymcgui.re/25afd1269b03299149434c9eef9696221d42ac4cd70e5be7a40954096ca27158.mp3"
],
"syndication": [
"https://huffduffer.com/schmarty/468889",
"https://twitter.com/schmarty/status/980232138049060865",
"https://www.facebook.com/marty.mcguire.54/posts/10211755682749518"
],
"name": "This Week in the IndieWeb Audio Edition \u2022 March 24th - 30th, 2018",
"content": {
"text": "Show/Hide Transcript \n \n Audio edition for This Week in the IndieWeb for March 24th - 30th, 2018.\n\nYou can find all of my audio editions and subscribe with your favorite podcast app here: martymcgui.re/podcasts/indieweb/.\n\nMusic from Aaron Parecki\u2019s 100DaysOfMusic project: Day 85 - Suit, Day 48 - Glitch, Day 49 - Floating, Day 9, and Day 11\n\nThanks to everyone in the IndieWeb chat for their feedback and suggestions. Please drop me a note if there are any changes you\u2019d like to see for this audio edition!",
"html": "Show/Hide Transcript \n \n <p>Audio edition for <a href=\"https://indieweb.org/this-week/2018-03-30.html\">This Week in the IndieWeb for March 24th - 30th, 2018</a>.</p>\n\n<p>You can find all of my audio editions and subscribe with your favorite podcast app here: <a href=\"https://martymcgui.re/podcasts/indieweb/\">martymcgui.re/podcasts/indieweb/</a>.</p>\n\n<p>Music from <a href=\"https://aaronparecki.com/\">Aaron Parecki</a>\u2019s <a href=\"https://100.aaronparecki.com/\">100DaysOfMusic project</a>: <a href=\"https://aaronparecki.com/2017/03/15/14/day85\">Day 85 - Suit</a>, <a href=\"https://aaronparecki.com/2017/02/06/7/day48\">Day 48 - Glitch</a>, <a href=\"https://aaronparecki.com/2017/02/07/4/day49\">Day 49 - Floating</a>, <a href=\"https://aaronparecki.com/2016/12/29/21/day-9\">Day 9</a>, and <a href=\"https://aaronparecki.com/2016/12/31/15/\">Day 11</a></p>\n\n<p>Thanks to everyone in the <a href=\"https://chat.indieweb.org/\">IndieWeb chat</a> for their feedback and suggestions. Please drop me a note if there are any changes you\u2019d like to see for this audio edition!</p>"
},
"author": {
"type": "card",
"name": "Marty McGuire",
"url": "https://martymcgui.re/",
"photo": "https://aperture-media.p3k.io/martymcgui.re/4f9fac2b9e3ae62998c557418143efe288bca8170a119921a9c6bfeb0a1263a2.jpg"
},
"_id": "178680",
"_source": "175",
"_is_read": true
}
This is a great analysis of the ability to import and export posts via Micropub/mf2. The benefit of that is even private posts can be exported! I definitely want to think through this more.
{
"type": "entry",
"published": "2018-03-31T20:00:05-04:00",
"summary": "This is a great analysis on the ability to import and export posts via Micropub/mf2. The benefit of that is even private posts can be exported! I definitely want to think through this more",
"url": "https://eddiehinkle.com/2018/03/31/10/reply/",
"category": [
"indieweb",
"micropub"
],
"in-reply-to": [
"https://overcast.fm/ Io4VpyYPk/32:42"
],
"content": {
"text": "This is a great analysis on the ability to import and export posts via Micropub/mf2. The benefit of that is even private posts can be exported! I definitely want to think through this more",
"html": "<p>This is a great analysis on the ability to import and export posts via Micropub/mf2. The benefit of that is even private posts can be exported! I definitely want to think through this more</p>"
},
"author": {
"type": "card",
"name": "Eddie Hinkle",
"url": "https://eddiehinkle.com/",
"photo": "https://aperture-media.p3k.io/eddiehinkle.com/cf9f85e26d4be531bc908d37f69bff1c50b50b87fd066b254f1332c3553df1a8.jpg"
},
"refs": {
"https://overcast.fm/ Io4VpyYPk/32:42": {
"type": "entry",
"url": "https://overcast.fm/ Io4VpyYPk/32:42",
"name": "https://overcast.fm/ Io4VpyYPk/32:42"
}
},
"_id": "178544",
"_source": "226",
"_is_read": true
}
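On the export side Eddie mentions: Micropub's q=source query returns a post's canonical properties as microformats2 JSON, the same shape a create request accepts, so posts (including private ones, given a suitably scoped token) can be pulled out of one server and replayed into another. A hedged sketch with placeholder endpoint, token, and post URL:
<?php
// Ask a Micropub endpoint for the source of one post. Endpoint, token, and post
// URL are placeholders.
$endpoint = 'https://example.com/micropub';
$token    = 'XXXX';
$postUrl  = 'https://example.com/2018/03/31/10/reply/';

$ch = curl_init($endpoint . '?' . http_build_query([
    'q'   => 'source',
    'url' => $postUrl,
]));
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $token],
]);
$response = curl_exec($ch);
curl_close($ch);

// The response is {"type": ["h-entry"], "properties": {...}}, which can be
// replayed as a JSON create request against another Micropub server.
print_r(json_decode($response, true));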
{
"type": "entry",
"published": "2018-03-30T20:10:41+00:00",
"url": "https://cleverdevil.io/2018/going-serverless-with-python-wsgi-apps",
"syndication": [
"https://twitter.com/cleverdevil/status/979815338085879809"
],
"name": "Going Serverless with Python WSGI Apps",
"content": {
"text": "I've been writing web applications and services in Python since the late 1990s, and enjoy it so much that I created the Pecan web application framework way back in 2010. Configuring and deploying Python web applications, especially WSGI compliant applications, is fairly straightforward, with great WSGI servers like Gunicorn and uWSGI, and excellent Apache integration via mod_wsgi. But, for many use cases, creating and maintaining one or more cloud servers creates unnecessary cost and complexity. Security patches, kernel upgrades, SSL certificate management, and more, can be a real burden.Since the creation of AWS Lambda, \"serverless\" has become a pretty popular buzzword. Could Lambda provide a way to deploy Python WSGI applications that helps reduce cost, complexity, and management overhead? First, let's consider what serverless really means.Introducing LambdaAWS Lambda is a cloud service that lets developers deploy and run code without provisioning or managing servers. Under the hood, there is of course still a server where code is run, but its existence is largely abstracted away. Lambda, and other services in the category, are likely better defined as \"functions as a service\" (FaaS).Lambda provides built-in Python support, and invoking Lambda functions can be done manually, or via an event triggered by an another AWS service, including Amazon S3, Amazon DynamoDB, and even Amazon Alexa, just to name a few.Lambda functions can also be invoked via HTTP through the use of the Amazon API Gateway, which opens up the possibility that WSGI applications could be exposed through Lambda. That said, the complexity of setting up a WSGI application to run within a Lambda execution environment is daunting.The Serverless FrameworkEnter the Serverless Framework, a toolkit for creating, managing, deploying, and operating serverless architectures. Serverless supports AWS Lambda, and other FaaS platforms, and makes the process of getting your code deployed to Lambda much easier. Serverless is written in JavaScript, and is easily installable through npm:$ npm install serverless -gOnce installed, you can use the serverless tool from the command line to perform a whole host of tasks, such as creating new functions from templates, deploying functions to providers, and invoking functions directly.Serverless WSGIThe serverless-wsgi plugin for the Serverless Framework allows you to take any Python WSGI application, and deploy it to Lambda with ease. 
Let's take a look at how!I've been working on a Python-based IndieAuth implementation called PunyAuth for a few weeks, and as an infrequently accessed web service, its a perfect candidate for a FaaS-backed deployment.First, I installed the serverless-wsgi plugin:$ npm install serverless-wsgi -gThen, I created a file called punywsgi.py that exposes PunyAuth as a WSGI application:from pecan.deploy import deploy\napp = deploy('my-config.py')In order to bundle up PunyAuth and all of its dependencies, serverless-wsgi needs a requirements.txt file, which is easily done using pip:$ pip freeze > requirements.txtFinally, I created a serverless.yml file that defines the service:service: serverless-punyauth\n\nplugins:\n - serverless-wsgi\n\ncustom:\n wsgi:\n app: punywsgi.app\n\nprovider:\n name: aws\n runtime: python3.6\n region: us-east-1\n iamRoleStatements:\n - Effect: \"Allow\"\n Action:\n - s3:*\n Resource:\n - arn:aws:s3:::cleverdevil-punyauth-testing/*\n\nfunctions:\n app:\n handler: wsgi.handler\n events:\n - http: ANY /\n - http: 'ANY {proxy+}'The serverless.yml file declares a service called serverless-punyauth, enables the serverless-wsgi plugin, and directs it to expose the WSGI app defined in punywsgi.app. When using serverless-wsgi, the bundled wsgi.handler can automatically map requests and responses coming in through the Amazon API Gateway to the deployed WSGI app.In the case of PunyAuth, the function itself needs read/write access to a particular AWS S3 bucket, which is accomplished here by defining an AWS IAM role that explicitly grants this access.At this point, the application is ready to be deployed to AWS Lambda.$ serverless deploy\nServerless: Packaging Python WSGI handler...\nServerless: Packaging required Python packages...\nServerless: Linking required Python packages...\nServerless: Packaging service...\nServerless: Excluding development dependencies...\nServerless: Unlinking required Python packages...\nServerless: Uploading CloudFormation file to S3...\nServerless: Uploading artifacts...\nServerless: Uploading service .zip file to S3 (2.08 MB)...\nServerless: Validating template...\nServerless: Updating Stack...\nServerless: Checking Stack update progress...\n................\nServerless: Stack update finished...\nService Information\nservice: serverless-punyauth\nstage: dev\nregion: us-east-1\nstack: serverless-punyauth-dev\napi keys:\n None\nendpoints:\n ANY - https://rpmchol040.execute-api.us-east-1.amazonaws.com/dev\n ANY - https://rpmchol040.execute-api.us-east-1.amazonaws.com/dev/{proxy+}\nfunctions:\n app: serverless-punyauth-dev-app\nServerless: Removing old service versions...Tada! We've deployed PunyAuth as a Lambda function!Use Cases and BenefitsDeploying WSGI applications on Lambda is certainly cool, but its also not appropriate for all use cases. Typically, WSGI applications are deployed in always-running WSGI servers. With Lambda, the behind-the-scenes server that represents the environment for your application is magically started and stopped on an as-needed basis, and the application itself will need to be loaded during the function invokation on-demand. 
This adds some additional overhead, so in the case of high-performance or frequently-accessed applications, you'll likely want to go another route.That said, for applications like PunyAuth, where performance isn't super critical, and the application is accessed relatively infrequently, this approach has a multitude of benefits.Benefit: CostDeploying a Python WSGI application the traditional way, with always-on infrastructure, will certainly result in higher performance, but also in significantly higher cost. With Lambda, you only pay for the actual execution time of your functions, rather than paying for, say, an EC2 instance that is always on. That means that hosting a low-traffic WSGI app in Lambda could cost you pennies a month.Benefit: ManagementWhile servers have certainly become easier to manage over the years, and managed hosting providers exist that will handle operating system updates and security patches, there's no question that deploying to Lambda will reduce your management overhead. The Lambda execution environment is entirely managed by Amazon, allowing you to focus on your application code, rather than on managing a fleet of servers.Benefit: SecurityWith AWS handling the heavy lifting of keeping the execution environment up-to-date with security patches, and the ability to apply fine-grained controls using AWS IAM roles, keeping your application secure is a bit easier.ConclusionAWS Lambda and the Serverless Framework provide a whole new way to host Python WSGI applications that can help reduce cost, eliminate management, and improve security.",
"html": "<p>I've been writing web applications and services in <a href=\"http://www.python.org\">Python</a> since the late 1990s, and enjoy it so much that I created the <a href=\"https://github.com/pecan/pecan\">Pecan web application framework</a> way back in 2010. Configuring and deploying Python web applications, especially <a href=\"http://wsgi.readthedocs.io/en/latest/\">WSGI</a> compliant applications, is fairly straightforward, with great WSGI servers like <a href=\"http://gunicorn.org\">Gunicorn</a> and <a href=\"http://projects.unbit.it/uwsgi\">uWSGI</a>, and excellent Apache integration via <a href=\"http://www.modwsgi.org/\">mod_wsgi</a>. But, for many use cases, creating and maintaining one or more cloud servers creates unnecessary cost and complexity. Security patches, kernel upgrades, SSL certificate management, and more, can be a real burden.</p><p>Since the creation of AWS <a href=\"https://aws.amazon.com/lambda/\">Lambda</a>, \"serverless\" has become a pretty popular buzzword. Could Lambda provide a way to deploy Python WSGI applications that helps reduce cost, complexity, and management overhead? First, let's consider what serverless really means.</p><h2>Introducing Lambda</h2><p>AWS Lambda is a cloud service that lets developers deploy and run code without provisioning or managing servers. Under the hood, there is of course still a server where code is run, but its existence is largely abstracted away. Lambda, and other services in the category, are likely better defined as \"functions as a service\" (FaaS).</p><p>Lambda provides <a href=\"https://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html\">built-in Python support</a>, and <a href=\"https://docs.aws.amazon.com/lambda/latest/dg/invoking-lambda-functions.html\">invoking Lambda functions</a> can be done manually, or via an event triggered by an another AWS service, including <a href=\"https://docs.aws.amazon.com/lambda/latest/dg/invoking-lambda-function.html#supported-event-source-s3\">Amazon S3</a>, <a href=\"https://aws.amazon.com/dynamodb/\">Amazon DynamoDB</a>, and even <a href=\"https://docs.aws.amazon.com/lambda/latest/dg/invoking-lambda-function.html#supported-event-source-echo\">Amazon Alexa</a>, just to name a few.</p><p>Lambda functions can also be invoked via HTTP through the use of the <a href=\"https://docs.aws.amazon.com/lambda/latest/dg/invoking-lambda-function.html#supported-event-source-api-gateway\">Amazon API Gateway</a>, which opens up the possibility that WSGI applications could be exposed through Lambda. That said, the complexity of setting up a WSGI application to run within a Lambda execution environment is daunting.</p><h2>The Serverless Framework</h2><p>Enter the <a href=\"https://serverless.com\">Serverless Framework</a>, a toolkit for creating, managing, deploying, and operating serverless architectures. Serverless supports AWS Lambda, and other FaaS platforms, and makes the process of getting your code deployed to Lambda much easier. 
Serverless is written in JavaScript, and is easily installable through <code>npm</code>:</p><pre>$ npm install serverless -g</pre><p>Once installed, you can use the <code>serverless</code> tool from the command line to perform a whole host of tasks, such as creating new functions from templates, deploying functions to providers, and invoking functions directly.</p><h2>Serverless WSGI</h2><p>The <a href=\"https://github.com/logandk/serverless-wsgi\">serverless-wsgi</a> plugin for the Serverless Framework allows you to take any Python WSGI application, and deploy it to Lambda with ease. Let's take a look at how!</p><p>I've been working on a Python-based <a href=\"https://indieweb.org/IndieAuth\">IndieAuth</a> implementation called <a href=\"https://github.com/cleverdevil/punyauth\">PunyAuth</a> for a few weeks, and as an infrequently accessed web service, its a perfect candidate for a FaaS-backed deployment.</p><p>First, I installed the <code>serverless-wsgi</code> plugin:</p><pre>$ npm install serverless-wsgi -g</pre><p>Then, I created a file called <code>punywsgi.py</code> that exposes PunyAuth as a WSGI application:</p><pre>from pecan.deploy import deploy\napp = deploy('my-config.py')</pre><p>In order to bundle up PunyAuth and all of its dependencies, serverless-wsgi needs a <code>requirements.txt</code> file, which is easily done using <code>pip</code>:</p><pre>$ pip freeze > requirements.txt</pre><p>Finally, I created a <code>serverless.yml</code> file that defines the service:</p><pre>service: serverless-punyauth\n\nplugins:\n - serverless-wsgi\n\ncustom:\n wsgi:\n app: punywsgi.app\n\nprovider:\n name: aws\n runtime: python3.6\n region: us-east-1\n iamRoleStatements:\n - Effect: \"Allow\"\n Action:\n - s3:*\n Resource:\n - arn:aws:s3:::cleverdevil-punyauth-testing/*\n\nfunctions:\n app:\n handler: wsgi.handler\n events:\n - http: ANY /\n - http: 'ANY {proxy+}'</pre><p>The <code>serverless.yml</code> file declares a service called <code>serverless-punyauth</code>, enables the <code>serverless-wsgi</code> plugin, and directs it to expose the WSGI app defined in <code>punywsgi.app</code>. 
When using <code>serverless-wsgi</code>, the bundled <code>wsgi.handler</code> can automatically map requests and responses coming in through the Amazon API Gateway to the deployed WSGI app.</p><p>In the case of PunyAuth, the function itself needs read/write access to a particular AWS S3 bucket, which is accomplished here by defining an AWS IAM role that explicitly grants this access.</p><p>At this point, the application is ready to be deployed to AWS Lambda.</p><pre>$ serverless deploy\nServerless: Packaging Python WSGI handler...\nServerless: Packaging required Python packages...\nServerless: Linking required Python packages...\nServerless: Packaging service...\nServerless: Excluding development dependencies...\nServerless: Unlinking required Python packages...\nServerless: Uploading CloudFormation file to S3...\nServerless: Uploading artifacts...\nServerless: Uploading service .zip file to S3 (2.08 MB)...\nServerless: Validating template...\nServerless: Updating Stack...\nServerless: Checking Stack update progress...\n................\nServerless: Stack update finished...\nService Information\nservice: serverless-punyauth\nstage: dev\nregion: us-east-1\nstack: serverless-punyauth-dev\napi keys:\n None\nendpoints:\n ANY - <a href=\"https://rpmchol040.execute-api.us-east-1.amazonaws.com/dev\">https://rpmchol040.execute-api.us-east-1.amazonaws.com/dev</a>\n ANY - <a href=\"https://rpmchol040.execute-api.us-east-1.amazonaws.com/dev/%7Bproxy+%7D\">https://rpmchol040.execute-api.us-east-1.amazonaws.com/dev/{proxy+}</a>\nfunctions:\n app: serverless-punyauth-dev-app\nServerless: Removing old service versions...</pre><p>Tada! We've deployed PunyAuth as a Lambda function!</p><h2>Use Cases and Benefits</h2><p>Deploying WSGI applications on Lambda is certainly cool, but its also not appropriate for all use cases. Typically, WSGI applications are deployed in always-running WSGI servers. With Lambda, the behind-the-scenes server that represents the environment for your application is magically started and stopped on an as-needed basis, and the application itself will need to be loaded during the function invokation on-demand. This adds some additional overhead, so in the case of high-performance or frequently-accessed applications, you'll likely want to go another route.</p><p>That said, for applications like PunyAuth, where performance isn't super critical, and the application is accessed relatively infrequently, this approach has a multitude of benefits.</p><h3>Benefit: Cost</h3><p>Deploying a Python WSGI application the traditional way, with always-on infrastructure, will certainly result in higher performance, but also in significantly higher cost. With Lambda, you only pay for the actual execution time of your functions, rather than paying for, say, an EC2 instance that is always on. That means that hosting a low-traffic WSGI app in Lambda could cost you pennies a month.</p><h3>Benefit: Management</h3><p>While servers have certainly become easier to manage over the years, and managed hosting providers exist that will handle operating system updates and security patches, there's no question that deploying to Lambda will reduce your management overhead. 
The Lambda execution environment is entirely managed by Amazon, allowing you to focus on your application code, rather than on managing a fleet of servers.</p><h3>Benefit: Security</h3><p>With AWS handling the heavy lifting of keeping the execution environment up-to-date with security patches, and the ability to apply fine-grained controls using AWS IAM roles, keeping your application secure is a bit easier.</p><h2>Conclusion</h2><p>AWS Lambda and the Serverless Framework provide a whole new way to host Python WSGI applications that can help reduce cost, eliminate management, and improve security.</p>"
},
"author": {
"type": "card",
"name": "Jonathan LaCour",
"url": "https://cleverdevil.io/profile/cleverdevil",
"photo": "https://aperture-media.p3k.io/cleverdevil.io/abdf4969f052cb64177f73cda9be6a709931eb55607f8c1fb2c69eb135841acf.jpg"
},
"_id": "175873",
"_source": "71",
"_is_read": true
}