For real though, I’ve been thinking more and more about the stuff I do on the side and how useful it actually is. I half promised to build a competitor to a platform by myself in less time (grossly underestimating the effort and care it takes to build these platforms), and now I sometimes feel like I’ve wasted two years of my life on this stuff. It’s not that I don’t believe in the open social Web; I just question its viability over the next decade, given the active moves to divest from things like it into more proprietary stacks.
One thing that trips me up a lot is how much of the open social Web is stuck on text (there’s barely anything around multimedia). Text seems easy, but photos require a whole different level of care and thought that’s consistently an afterthought. I don’t see much nudging on it in the IndieWeb - some progress on handling them is made in mainline projects like Mastodon, but again, it’s marginal. It’s not that people are playing catch-up - I don’t think this is an interest for most at all (and it’s a safe way to prevent any of the harm found in silos from leaking into our space) - but I do think we should be aiming to wean people off these platforms. I see more and more asks for alternatives, and I can’t even comfortably recommend the ones that come close to feature parity.
I’m sad. And I’m hoping to use this as motivation to keep hammering on things.
{
"type": "entry",
"published": "2020-08-16T11:08:00.00000-07:00",
"url": "https://v2.jacky.wtf/post/5803f27d-50d5-49a5-a5a4-0a4142c79689",
"category": [
"open social web",
"itch",
"indieweb",
"thoughts"
],
"content": {
"text": "For real though, I\u2019ve been thinking more and more about the stuff I do on the side and the usefulness of it. I half promised a competitor to a platform BY myself in less time (grossly overestimating the effort and care it takes to build these platforms) and now I sometimes feel like I\u2019ve wasted two years of my life on this stuff. It\u2019s not like I don\u2019t believe in the open social Web, I just challenge my thoughts on its viability in the next decade as well as the active moves working to divest from things like it into more proprietary stacks.One thing that trips me up a lot is how a lot of the stuff in the open social Web is stuck on text (barely anything around multimedia). Text seems easy but photos require a whole different level of care and thought that\u2019s consistently an afterthought. I don\u2019t see much nudging on it in the IndieWeb - some progress on handling them is made in mainline projects like Mastodon but again, it\u2019s marginal. It\u2019s not that people are playing to catch up - I don\u2019t think this is an interest for most at all (and it\u2019s a safe way to prevent any of the harm found in silos from leaking into our space) but I do think we should be aiming to ween people off these platforms. I see more and more asks for alternatives and I can\u2019t even comfortably recommend most that meet feature parity.I\u2019m sad. And I\u2019m hoping to use this as motivation to keep hammering on things.",
"html": "<p>For real though, I\u2019ve been thinking more and more about the stuff I do on the side and the usefulness of it. I half promised a competitor to a platform BY myself in less time (grossly overestimating the effort and care it takes to build these platforms) and now I sometimes feel like I\u2019ve wasted two years of my life on this stuff. It\u2019s not like I don\u2019t believe in the open social Web, I just challenge my thoughts on its viability in the next decade as well as the active moves working to divest from things like it into more proprietary stacks.</p><p>One thing that trips me up a lot is how a lot of the stuff in the open social Web is stuck on text (barely anything around multimedia). Text seems easy but photos require a whole different level of care and thought that\u2019s consistently an afterthought. I don\u2019t see much nudging on it in the IndieWeb - some progress on handling them is made in mainline projects like Mastodon but again, it\u2019s marginal. It\u2019s not that people are playing to catch up - I don\u2019t think this is an interest for most at all (and it\u2019s a safe way to prevent any of the harm found in silos from leaking into our space) but I do think we should be aiming to ween people off these platforms. I see more and more asks for alternatives and I can\u2019t even comfortably recommend most that meet feature parity.</p><p>I\u2019m sad. And I\u2019m hoping to use this as motivation to keep hammering on things.</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "note",
"_id": "14056406",
"_source": "1886",
"_is_read": true
}
Replied to
Good to hear! I think generated flat files are the future. How much electricity is wasted by compiling php every page load? I'm looking at something similar called hugo atm. TBH no-one comments on blogs any more anyway :/ It'd be nice to keep the old ones though..
— Alex McLean (@yaxu) August 13, 2020
Hugo’s a good un but you might personally like Hakyll… jaspervdj.be/hakyll/
One nice way to get comments on a static site is webmentions – indieweb.org/Webmention
Also on: Twitter
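Mechanically, a webmention is just a form-encoded POST: the sender discovers the receiver’s endpoint from the target page, then posts `source` and `target` URLs to it. A minimal sketch of the two pieces (HTML-regex discovery is a simplification — a real client also checks the HTTP `Link` header and handles attribute order; all URLs here are placeholders):

```python
import re
from urllib.parse import urljoin, urlencode

def discover_webmention_endpoint(page_url, html):
    """Find a rel="webmention" endpoint declared in a page's HTML.

    Simplified: assumes rel appears before href and double quotes are
    used; the Webmention spec also allows the HTTP Link header.
    """
    match = re.search(
        r'<(?:link|a)[^>]+rel="?webmention"?[^>]*href="([^"]*)"', html
    )
    if not match:
        return None
    # The endpoint URL may be relative to the page that declared it.
    return urljoin(page_url, match.group(1))

def build_webmention(source, target):
    """Body of the form-encoded POST sent to the discovered endpoint."""
    return urlencode({"source": source, "target": target})
```

A static site stays static: the endpoint that receives these POSTs can run elsewhere (a small service or a hosted one) and the site only has to advertise it.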
{
"type": "entry",
"author": {
"name": "Neil Mather",
"url": "https://doubleloop.net/",
"photo": null
},
"url": "https://doubleloop.net/2020/08/15/static-question/",
"published": "2020-08-15T16:39:36+00:00",
"content": {
"html": "Replied to \n<blockquote><blockquote><p>Good to hear! I think generated flat files are the future. How much electricity is wasted by compiling php every page load? I'm looking at something similar called hugo atm. TBH no-one comments on blogs any more anyway :/ It'd be nice to keep the old ones though..</p>\u2014 Alex McLean (@yaxu) <a href=\"https://twitter.com/yaxu/status/1294040627441807360?ref_src=twsrc%5Etfw\">August 13, 2020</a></blockquote><a href=\"https://twitter.com/yaxu/status/1294040627441807360\"></a></blockquote>\n\nHugo\u2019s a good un but you might personally like Hakyll\u2026 <a href=\"https://jaspervdj.be/hakyll/\">jaspervdj.be/hakyll/</a>\n<p>One nice way to get comments on a static site is webmentions \u2013 <a href=\"https://indieweb.org/Webmention\">indieweb.org/Webmention</a></p>\nAlso on:<p><a href=\"https://twitter.com/loopdouble/status/1294675129356759040\"> Twitter</a></p>",
"text": "Replied to \nGood to hear! I think generated flat files are the future. How much electricity is wasted by compiling php every page load? I'm looking at something similar called hugo atm. TBH no-one comments on blogs any more anyway :/ It'd be nice to keep the old ones though..\u2014 Alex McLean (@yaxu) August 13, 2020\n\nHugo\u2019s a good un but you might personally like Hakyll\u2026 jaspervdj.be/hakyll/\nOne nice way to get comments on a static site is webmentions \u2013 indieweb.org/Webmention\nAlso on: Twitter"
},
"post-type": "note",
"_id": "14032947",
"_source": "1895",
"_is_read": true
}
Matt made this website to explain RSS to people who are as-yet unfamiliar with it.
{
"type": "entry",
"published": "2020-08-14T15:03:20Z",
"url": "https://adactio.com/links/17296",
"category": [
"rss",
"feeds",
"syndication",
"explanation",
"explainer",
"subscribe",
"subscriptions",
"newsreaders",
"feedreaders",
"indieweb"
],
"bookmark-of": [
"https://aboutfeeds.com/"
],
"content": {
"text": "About Feeds | Getting Started guide to web feeds/RSS\n\n\n\nMatt made this website to explain RSS to people who are as-yet unfamiliar with it.",
"html": "<h3>\n<a class=\"p-name u-bookmark-of\" href=\"https://aboutfeeds.com/\">\nAbout Feeds | Getting Started guide to web feeds/RSS\n</a>\n</h3>\n\n<p>Matt made this website to explain RSS to people who are as-yet unfamiliar with it.</p>"
},
"author": {
"type": "card",
"name": "Jeremy Keith",
"url": "https://adactio.com/",
"photo": "https://adactio.com/images/photo-150.jpg"
},
"post-type": "bookmark",
"_id": "14008478",
"_source": "2",
"_is_read": true
}
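The plumbing behind the “subscribe” step that About Feeds explains is feed autodiscovery: a page advertises its feed with `<link rel="alternate">` tags in the head, and readers pick those up. A rough sketch of that discovery (regex-based for brevity — a real reader would use an HTML parser; the URL in the usage note is hypothetical):

```python
import re
from urllib.parse import urljoin

FEED_TYPES = (
    "application/rss+xml",
    "application/atom+xml",
    "application/feed+json",
)

def discover_feeds(page_url, html):
    """Return absolute feed URLs advertised via <link rel="alternate">.

    Simplified: assumes double-quoted attributes; attribute order within
    the tag does not matter because type/href are re-searched per tag.
    """
    feeds = []
    for m in re.finditer(r'<link[^>]+rel="alternate"[^>]*>', html):
        tag = m.group(0)
        type_m = re.search(r'type="([^"]*)"', tag)
        href_m = re.search(r'href="([^"]*)"', tag)
        if type_m and href_m and type_m.group(1) in FEED_TYPES:
            feeds.append(urljoin(page_url, href_m.group(1)))
    return feeds
```

Running it over a page like `https://example.com/` whose head contains `<link rel="alternate" type="application/rss+xml" href="/index.xml">` would yield the absolute feed URL.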
{
"type": "entry",
"author": {
"name": "Kh\u00fcrt",
"url": "https://islandinthenet.com/",
"photo": null
},
"url": "https://islandinthenet.com/my-ban-on-eu-website-traffic-has-been-lifted/",
"published": "2018-05-17T22:08:40+00:00",
"content": {
"html": "<p>Due to concerns about my <strong>legal</strong> responsibilities around compliance the European Union General Data Protection Regulations, I configured my Wordfence web application firewall (WAF) to block all traffic origination in EU member countries. While some people think this was an extreme move, a lack of clarity around what is expected of small website operators and that I operate an information technology related consultancy, left me feeling vulnerable. Until I could understand what/if I needed to do to comply with GDPR\u2019s \u201cright to be forgotten\u201d, I simply did not want the risk.<br />Today, I have removed the WAF rules that restrict traffic originating in the EU. <a href=\"https://automattic.com/\">Automattic</a>, the company behind WordPress.com and the supporters of WordPress.org, have updated/are <a href=\"https://automattic.com/privacy/\">updating</a> JetPack and other properties to comply with the GDPR. Currently, my self-hosted WordPress uses the Jetpack plug-in to handle things like comments and website traffic analysis. This moves some of the risks off to Automattic. They will be the data controller for information collected via comments and website analytics.<br />Automattic has provided information on what <a href=\"https://jetpack.com/support/markdown/#privacy\">information JetPack collects for comments</a> and how that data is used. They have done the same for <a href=\"https://jetpack.com/support/wordpress-com-stats/#privacy\">website analytics</a>. Click on those links to find out more.<br />I have added \u201cDo Not Track\u201d <a href=\"https://jetpack.com/support/wordpress-com-stats/#honoring-dnt\">code</a> to my WordPress config via JetPack. According to <a href=\"https://jetpack.com/support/wordpress-com-stats/#data-visibility-and-retention\">Automattic</a>.</p>\n<blockquote><p>\n Any piece of data explicitly identifying a specific user (IP address, WordPress.com ID, WordPress.com username, etc.) 
is not visible to the site owner when using this feature. For example, a site owner can see that a specific post has 285 views, but he/she cannot see which specific users/accounts viewed that post.<br />Stats logs \u2014 containing visitor IP addresses and WordPress.com usernames (if available) \u2014 are retained by Automattic for 28 days and are used only for the purpose of powering this feature.\n</p></blockquote>\n<p>Comments on my blog will be restricted to what JetPack and Webmentions provide. I expect that JetPack comments will soon have the ability for commenters to delete comments, allowing compliance with GDPR requirements. I expect that people using Webmentions understand how they work and understand that they can delete a comment by sending another Webmention to do so.<br />I do not intend to collect any information on visitors or commenters to this website other than what JetPacks collects.<br />I am basing my decision to remove the WAF rules based on the changes that Automattic is making and also on guidance in this <a href=\"https://www.codeinwp.com/blog/complete-wordpress-gdpr-guide/\">codeinwp.blog</a> post. Also, Wordfence has applied \u201cfor the <a href=\"https://www.wordfence.com/blog/2018/05/wordfence-gdpr-compliance-update-2/\">Privacy Shield</a> certification program for both EU-US and Swiss-US and will soon have available a Data Processing Agreement\u201d for EU customers who need one.<br />I guess what\u2019s really pissing me off is that although I live in the United States of American, some fucking European law can reach across the ocean and potentially affect me. That, that pisses me off!!<br /><img src=\"http://142.93.124.147/wp-content/uploads/2018/05/Screen-Shot-2018-05-17-at-6.12.47-PM.jpg\" alt=\"\" />Geography of the visitors to Island in the Net.</p>",
"text": "Due to concerns about my legal responsibilities around compliance the European Union General Data Protection Regulations, I configured my Wordfence web application firewall (WAF) to block all traffic origination in EU member countries. While some people think this was an extreme move, a lack of clarity around what is expected of small website operators and that I operate an information technology related consultancy, left me feeling vulnerable. Until I could understand what/if I needed to do to comply with GDPR\u2019s \u201cright to be forgotten\u201d, I simply did not want the risk.\nToday, I have removed the WAF rules that restrict traffic originating in the EU. Automattic, the company behind WordPress.com and the supporters of WordPress.org, have updated/are updating JetPack and other properties to comply with the GDPR. Currently, my self-hosted WordPress uses the Jetpack plug-in to handle things like comments and website traffic analysis. This moves some of the risks off to Automattic. They will be the data controller for information collected via comments and website analytics.\nAutomattic has provided information on what information JetPack collects for comments and how that data is used. They have done the same for website analytics. Click on those links to find out more.\nI have added \u201cDo Not Track\u201d code to my WordPress config via JetPack. According to Automattic.\n\n Any piece of data explicitly identifying a specific user (IP address, WordPress.com ID, WordPress.com username, etc.) is not visible to the site owner when using this feature. 
For example, a site owner can see that a specific post has 285 views, but he/she cannot see which specific users/accounts viewed that post.\nStats logs \u2014 containing visitor IP addresses and WordPress.com usernames (if available) \u2014 are retained by Automattic for 28 days and are used only for the purpose of powering this feature.\n\nComments on my blog will be restricted to what JetPack and Webmentions provide. I expect that JetPack comments will soon have the ability for commenters to delete comments, allowing compliance with GDPR requirements. I expect that people using Webmentions understand how they work and understand that they can delete a comment by sending another Webmention to do so.\nI do not intend to collect any information on visitors or commenters to this website other than what JetPacks collects.\nI am basing my decision to remove the WAF rules based on the changes that Automattic is making and also on guidance in this codeinwp.blog post. Also, Wordfence has applied \u201cfor the Privacy Shield certification program for both EU-US and Swiss-US and will soon have available a Data Processing Agreement\u201d for EU customers who need one.\nI guess what\u2019s really pissing me off is that although I live in the United States of American, some fucking European law can reach across the ocean and potentially affect me. That, that pisses me off!!\nGeography of the visitors to Island in the Net."
},
"name": "My ban on EU website traffic has been lifted.",
"post-type": "article",
"_id": "13987305",
"_source": "242",
"_is_read": true
}
{
"type": "entry",
"author": {
"name": "Kh\u00fcrt",
"url": "https://islandinthenet.com/",
"photo": null
},
"url": "https://islandinthenet.com/keyring-social-importers-wordpress/",
"published": "2018-01-01T19:12:52+00:00",
"content": {
"html": "<p>[exif id=\u201d27124\u2033]<br />Update: Developer, <a href=\"https://dentedreality.com.au/about/\">Bewu Lebens</a> has written a <a href=\"https://dentedreality.com.au/projects/wp-keyring/\">WordPress: Keyring Developer\u2019s Guide</a>. I will revisit how I used this plugin after I familiarise myself with the guide. I also plan on developing a Keyring importer for Untappd.<br />On Twitter, Xavier Roy posted a <a href=\"https://twitter.com/xavierroy/status/944186489725796352\">response</a> to my <a href=\"http://104.236.229.226/instagram-wordpress/\">post</a> about my experiment with importing Instagram post to WordPress. He suggested I try the <a href=\"https://wordpress.org/plugins/keyring-social-importers/\">Keyring Social Importers</a> plugin. After some back and forth on Twitter, I gave it a go and updated the original post.<br />The results were not pleasing. However, I was fascinated by the plugin and tried it with Foursquare.<br />The Keyring Social Importers plugin provides a set of social importers that pull in content created on other sites and re-publishes it on a WordPress site. After an initial import, the importers optionally check each hour and automatically download new content. New posts are created for each item imported with support for specific Post Formats, depending on the content type.<br />At the time of writing this, the plugin had not been tested on WordPress 4.9. The untested code can be unstable code and unstable code can lead to security leaks etc. so I configured and tested everything on a test instance of WordPress.<br />I also installed Keyring, a plugin which is required to use with Keyring Social Importer, and which provides the authentication and API connections to each of the external services. 
I configured API access to Foursquare and then from the \u201cTools->Import\u201d section of the dashboard I clicked the link to authenticate and to start importing from Foursquare.<br />I\u2019ve had my Foursquare account for a very long time so the import took a lot of time. Each check and photo on my Foursquare account was downloaded and imported into my WordPress Media Library. The imported image was attached to each post as a featured image, which is a feature I wanted, however, each image was also attached in the body of every post and is only 640px \u00d7 479px.<br />The importer leverages the Post Kinds and Simple Location IndieWeb plugins to set the Post Kind to Photo and set the geographic location for the post. However, syndication links were not set. But it\u2019s not too much work to set it manually later.<br />The Foursquare importer works but the results were not attractive. First, I did not want to import every one of my past check-ins. I wanted to limit the initial set of imports. Second, I wanted the higher resolution version of the image.<br />For example, for check-in at <a href=\"https://foursquare.com/khurtwilliams/checkin/5a496adec0f1632dde4f9774\">Osteria Procaccini</a>, the Keyring Social Importer imports this image.<br />I poked around the web and learned that it\u2019s possible to get at this image via the <a href=\"https://developer.foursquare.com/docs/api/users/checkins\">Foursquare API</a>. I discovered that by modifying the image URL and replacing the image dimension with the text <em>original</em>, I could get at the full-size images. 
But I also discovered that I could get what I wanted by removing some HTML.<br /><img src=\"https://i0.wp.com/igx.4sqi.net/img/general/640x640/108856_El7kZnp5nO4EESFJRB4KmxfXSfIcn_mbceLEOylUfdM.jpg?w=840&ssl=1\" alt=\"\" /><br />What I needed to do was to find the section of code that imports images, and change the code to import the higher-resolution image.<br />In the <code>wp-content/plugins/keyring-social-importers/keyring-importers.php</code> file, I found the following code between lines 865 to 876. If I removed that code, I would have everything I wanted.</p>\n<pre><code><br />if ( $data ) {\n$img = \u2018<img class=\"keyring-img\" src=\"' . esc_url( $data[0] ) . '\" alt=\"' . esc_attr( $post['post_title'] ) . '\" width=\"' . esc_attr( $data[1] ) . '\" height=\"' . esc_attr( $data[2] ) . '\" />\u2019;\n}\n// Regex out the previous img tag, put this one in there instead, or prepend it to the top/bottom, depending on $append\nif ( stristr( $post['post_content'], $url ) ) {\n$post['post_content'] = preg_replace( '!<img />]*src=[\\'\"]' . preg_quote( $url ) . '[\\'\"][^>]*>!', $img, $post['post_content'] ) . \"\\n\";\n} else if ( $append ) {\n$post['post_content'] = $post['post_content'] . \"\\n\\n\" . $img;\n} else {\n$post['post_content'] = $img . \"\\n\\n\" . $post['post_content'];\n}\n}\n</code></pre>\n<p><img src=\"https://i0.wp.com/igx.4sqi.net/img/general/original/108856_El7kZnp5nO4EESFJRB4KmxfXSfIcn_mbceLEOylUfdM.jpg?w=840&ssl=1\" alt=\"\" /><br />The default is for Keyring Social Imports to import 200 posts in oldest to newest order. I didn\u2019t want every post and I wanted only one newest post. A modification to line 74 of the <code>wp-content/plugins/keyring-social-importers/importers/keyring-importer-foursquare.php</code> file yielded the results I wanted.<br />Before:<br /><code>$url = \"https://api.foursquare.com/v2/users/\" . $this-&gt;get_option( 'user_id', 'self' ) . 
\"/checkins?limit=200\";</code><br />After:<br /><code>$url = \"https://api.foursquare.com/v2/users/\" . $this-&gt;get_option( 'user_id', 'self' ) . \"/checkins?limit=2&amp;sort=newestfirst\";</code><br />I tested the code over the last few weeks and so far I have not encountered any issues. I think with these modifications to the code, the Keyring Social Importers plugin is a reliable <a href=\"https://indieweb.org/WordPress/Plugins#PESOS_Plugins\">PESOS WordPress plugin</a> for importing Foursquare check-ins.</p>",
"text": "[exif id=\u201d27124\u2033]\nUpdate: Developer, Bewu Lebens has written a WordPress: Keyring Developer\u2019s Guide. I will revisit how I used this plugin after I familiarise myself with the guide. I also plan on developing a Keyring importer for Untappd.\nOn Twitter, Xavier Roy posted a response to my post about my experiment with importing Instagram post to WordPress. He suggested I try the Keyring Social Importers plugin. After some back and forth on Twitter, I gave it a go and updated the original post.\nThe results were not pleasing. However, I was fascinated by the plugin and tried it with Foursquare.\nThe Keyring Social Importers plugin provides a set of social importers that pull in content created on other sites and re-publishes it on a WordPress site. After an initial import, the importers optionally check each hour and automatically download new content. New posts are created for each item imported with support for specific Post Formats, depending on the content type.\nAt the time of writing this, the plugin had not been tested on WordPress 4.9. The untested code can be unstable code and unstable code can lead to security leaks etc. so I configured and tested everything on a test instance of WordPress.\nI also installed Keyring, a plugin which is required to use with Keyring Social Importer, and which provides the authentication and API connections to each of the external services. I configured API access to Foursquare and then from the \u201cTools->Import\u201d section of the dashboard I clicked the link to authenticate and to start importing from Foursquare.\nI\u2019ve had my Foursquare account for a very long time so the import took a lot of time. Each check and photo on my Foursquare account was downloaded and imported into my WordPress Media Library. 
The imported image was attached to each post as a featured image, which is a feature I wanted, however, each image was also attached in the body of every post and is only 640px \u00d7 479px.\nThe importer leverages the Post Kinds and Simple Location IndieWeb plugins to set the Post Kind to Photo and set the geographic location for the post. However, syndication links were not set. But it\u2019s not too much work to set it manually later.\nThe Foursquare importer works but the results were not attractive. First, I did not want to import every one of my past check-ins. I wanted to limit the initial set of imports. Second, I wanted the higher resolution version of the image.\nFor example, for check-in at Osteria Procaccini, the Keyring Social Importer imports this image.\nI poked around the web and learned that it\u2019s possible to get at this image via the Foursquare API. I discovered that by modifying the image URL and replacing the image dimension with the text original, I could get at the full-size images. But I also discovered that I could get what I wanted by removing some HTML.\n\nWhat I needed to do was to find the section of code that imports images, and change the code to import the higher-resolution image.\nIn the wp-content/plugins/keyring-social-importers/keyring-importers.php file, I found the following code between lines 865 to 876. If I removed that code, I would have everything I wanted.\n\nif ( $data ) {\n$img = \u2018<img class=\"keyring-img\" src=\"' . esc_url( $data[0] ) . '\" alt=\"' . esc_attr( $post['post_title'] ) . '\" width=\"' . esc_attr( $data[1] ) . '\" height=\"' . esc_attr( $data[2] ) . '\" />\u2019;\n}\n// Regex out the previous img tag, put this one in there instead, or prepend it to the top/bottom, depending on $append\nif ( stristr( $post['post_content'], $url ) ) {\n$post['post_content'] = preg_replace( '!<img />]*src=[\\'\"]' . preg_quote( $url ) . '[\\'\"][^>]*>!', $img, $post['post_content'] ) . 
\"\\n\";\n} else if ( $append ) {\n$post['post_content'] = $post['post_content'] . \"\\n\\n\" . $img;\n} else {\n$post['post_content'] = $img . \"\\n\\n\" . $post['post_content'];\n}\n}\n\n\nThe default is for Keyring Social Imports to import 200 posts in oldest to newest order. I didn\u2019t want every post and I wanted only one newest post. A modification to line 74 of the wp-content/plugins/keyring-social-importers/importers/keyring-importer-foursquare.php file yielded the results I wanted.\nBefore:\n$url = \"https://api.foursquare.com/v2/users/\" . $this->get_option( 'user_id', 'self' ) . \"/checkins?limit=200\";\nAfter:\n$url = \"https://api.foursquare.com/v2/users/\" . $this->get_option( 'user_id', 'self' ) . \"/checkins?limit=2&sort=newestfirst\";\nI tested the code over the last few weeks and so far I have not encountered any issues. I think with these modifications to the code, the Keyring Social Importers plugin is a reliable PESOS WordPress plugin for importing Foursquare check-ins."
},
"name": "Experiment with FourSquare, WordPress and the Keyring Social Importers plugin",
"post-type": "article",
"_id": "13986742",
"_source": "242",
"_is_read": true
}
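The URL trick the post leans on is mechanical: Foursquare image URLs embed the size as a `WIDTHxHEIGHT` path segment, and swapping that segment for the text `original` requests the full-size file. A one-function sketch of that rewrite (the URL shape is taken from the post’s examples, not from any official API guarantee):

```python
import re

def foursquare_original(url):
    """Replace the WIDTHxHEIGHT segment of a Foursquare image URL
    with 'original' to request the full-size version of the photo."""
    return re.sub(r"/\d+x\d+/", "/original/", url, count=1)
```

Doing the substitution once, at import time, avoids the post’s alternative of patching the plugin’s regex-replacement code by hand.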
Making an #IndieAuth authorization endpoint. #indieweb #webdesign
{
"type": "entry",
"published": "2020-08-13T15:37:04+00:00",
"url": "https://fireburn.ru/posts/1597333024",
"category": [
"IndieAuth",
"IndieWeb",
"webdesign"
],
"photo": [
"https://fireburn.ru/media/3f/c6/36/49/3198a77da34b0e94e1b32e0856812c32e2a1eadb691c33aa9eca3d48.png",
"https://fireburn.ru/media/4d/6c/c1/e7/bf15fc923ce566fc26827682d3a0f683af1aa0c5ed6270deaa9ab4e8.png"
],
"syndication": [
"https://twitter.com/kisik21/status/1293934634796036097"
],
"content": {
"text": "Making an #IndieAuth authorization endpoint. #indieweb #webdesign",
"html": "<p>Making an #IndieAuth authorization endpoint. #indieweb #webdesign</p>"
},
"author": {
"type": "card",
"name": "Vika",
"url": "https://fireburn.ru/",
"photo": "https://fireburn.ru/media/f1/5a/fb/9b/081efafb97b4ad59f5025cf2fd0678b8f3e20e4c292489107d52be09.png"
},
"post-type": "photo",
"_id": "13985873",
"_source": "1371",
"_is_read": true
}
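For context on what an IndieAuth authorization endpoint receives: clients redirect the user to it with OAuth 2.0-style query parameters. A sketch of the client side building that request (parameter names follow the IndieAuth spec’s `response_type=code` flow with PKCE; the endpoint and site URLs are hypothetical):

```python
import base64
import hashlib
import secrets
from urllib.parse import urlencode

def build_auth_request(endpoint, client_id, redirect_uri, state, me=None):
    """Construct the URL a client redirects the user to for IndieAuth
    authorization. Returns (url, code_verifier); the verifier is kept
    by the client and sent later when redeeming the code."""
    verifier = secrets.token_urlsafe(32)
    # PKCE S256: base64url(sha256(verifier)) without '=' padding.
    challenge = base64.urlsafe_b64encode(
        hashlib.sha256(verifier.encode()).digest()
    ).rstrip(b"=").decode()
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
        "code_challenge": challenge,
        "code_challenge_method": "S256",
    }
    if me:
        # Optional hint telling the endpoint which identity is claimed.
        params["me"] = me
    return f"{endpoint}?{urlencode(params)}", verifier
```

The endpoint’s job is then to authenticate the user, redirect back to `redirect_uri` with a `code` and the echoed `state`, and later verify the presented `code_verifier` against the stored challenge.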
{
"type": "entry",
"published": "2020-08-11T20:52:00+01:00",
"url": "https://www.jvt.me/mf2/2020/08/uilcf/",
"category": [
"indieauth"
],
"bookmark-of": [
"http://beesbuzz.biz/blog/6265-Two-PSAs-regarding-IndieAuth"
],
"author": {
"type": "card",
"name": "Jamie Tanna",
"url": "https://www.jvt.me",
"photo": "https://www.jvt.me/img/profile.png"
},
"post-type": "bookmark",
"_id": "13936090",
"_source": "2169",
"_is_read": true
}
{
"type": "entry",
"author": {
"name": "fluffy",
"url": "http://beesbuzz.biz/",
"photo": null
},
"url": "http://beesbuzz.biz/blog/6265-Two-PSAs-regarding-IndieAuth",
"published": "2020-08-11T11:32:07-07:00",
"content": {
"html": "<p><a href=\"https://indieauth.net/\">IndieAuth</a> is starting to get some traction in the greater Internet space, which is really cool! I\u2019m glad to see a protocol finally emerging around distributed/federated identity, managing to get some traction where OpenID more or less failed (despite a few hangers-on still supporting it).</p><p>There are two issues that implementers of IndieAuth clients (i.e. websites which use IndieAuth for authentication) and endpoints (i.e. the things which do the actual authentication) should be aware of.</p>\n\n\n<h2><a href=\"http://beesbuzz.biz/blog/6265-Two-PSAs-regarding-IndieAuth#6265_h2_1_Recent-spec-change\"></a>Recent spec change</h2><p>There was a <a href=\"https://github.com/indieweb/indieauth/issues/42\">recent change to the specification</a> which leads to some incompatibility around scopeless identity verification. This is a flow that\u2019s used for simple sign-on to websites; for example it\u2019s what <a href=\"https://github.com/PlaidWeb/Authl\">Authl</a> uses, as it\u2019s not requesting permission to the user\u2019s resources, but just associating the user with an identity. This is a pretty common use case for IndieAuth.</p><p>In the previous version of the spec, the scopeless identity verification used <code>response_type=id</code> and didn\u2019t specify a <code>scope</code>; these two options were mutually-exclusive, and this led to a somewhat confusing spec. The latest spec update changes this flow to use <code>response_type=code</code> and to simply omit the <code>scope</code>. This is a much better protocol in general, <em>but</em> it has the problem of being fundamentally incompatible with older endpoints which do strict parameter checking.</p><p>Unfortunately, the current spec doesn\u2019t document the legacy <code>response_type=id</code> as a backwards-compatibility path, so any newly-written endpoints will not support older clients. 
Newer clients will attempt to send a scopeless <code>response_type=code</code> request which will fail on older endpoints.</p><p>The nature of the IndieAuth protocol also means that there\u2019s no way for the client to know which protocol version is in use or to get notified that there was a failure, so the end result will be a bad user experience (such as their login page returning an opaque error about \u201cmissing scope\u201d or \u201cinvalid response type\u201d or whatever).</p><p>So, some suggested mitigations:</p>\n<ul><li>Endpoint authors: Accept <code>response_type=id</code> with no <code>scope</code> parameter, possibly displaying a deprecation warning to the user (asking that they reach out to the client website to update to the latest version of the specification)</li>\n<li>Client authors: Always send <code>response_type=code</code> and request a meaningful but non-invasive-sounding scope, such as <code>me</code> or <code>profile</code> or whatever. (This is what Authl will be doing as of the next release.)</li>\n</ul><p>The actual scope for clients to request is very much up for debate. Here\u2019s a few different things I\u2019ve seen proposed:</p>\n<ul><li><code>me</code>: The most straightforward, in keeping with IndieAuth nomenclature. Literally all you\u2019re asking for is the <code>me</code> URL, which is already a basic part of the verification response. But this is kind of confusing to end users.</li>\n<li><code>profile</code>: This seems user-friendly, but there\u2019s the <a href=\"https://github.com/indieweb/indieauth/issues/41\">possibility</a> that future IndieAuth endpoints might use this as a cue to send the profile directly rather than making it a separate retrieval of <code>h-card</code> data from the identity URL. There\u2019s reasons to do this, so I\u2019d much rather reserve this scope for when IndieAuth supports it natively. 
On the other hand, it\u2019d also be nice to just automatically start receiving profile data when endpoints start supporting that.</li>\n<li><code>email</code>: This implies that someone will be able to do something unspecified with the user\u2019s email. It\u2019s also not actually an email address that is being requested in the first place.</li>\n<li><code>read</code>: This implies being able to read private data. Keep this scope for things like micropub!</li>\n</ul><p>So for me the decision is either <code>me</code> or <code>profile</code> but both have caveats. I\u2019ll probably go with <code>profile</code> since it seems like the least-confusing thing (at least in the context of Authl), but per the IndieAuth spec you <em>should</em> be able to use any scope you want. But there effectively has to be a scope, at least until every endpoint on the Internet has been updated or at least a reasonable amount of time has passed that all endpoints \u201cshould\u201d have been updated.</p><h2><a href=\"http://beesbuzz.biz/blog/6265-Two-PSAs-regarding-IndieAuth#6265_h2_2_Identity-verification-gap\"></a>Identity verification gap</h2><p>IndieAuth is designed under the assumption that everyone has their own domain name, which is an admirable goal but has a few problems for a lot of people, who don\u2019t have the technical acumen necessary to register and/or host their own domain but do have access to shared-domain hosting (such as <a href=\"https://tilde.club/\">tilde.club</a> or their university-provided academic homepage). 
However, there\u2019s nothing in the spec that actually requires that someone own their domain, and it\u2019s quite possible for someone to declare an endpoint on such a webpage, with no way for someone to verify that the webpage is authoritative for the entire domain (after all, it\u2019s just a <code><link></code> tag or a response header).</p><p>Where this becomes a security concern is that since the IndieAuth spec doesn\u2019t require proof of ownership of the domain, it considers any authentication against any page on a domain to be authoritative for the entire domain. As a proof of concept, you can use <a href=\"https://tilde.club/~fluffy/\">tilde.club/~fluffy/</a> as an identity URL that will let you verify as <em>any</em> identity on tilde.club, at least for things that follow the IndieAuth spec. (This is one place where Authl diverges from the specification, because I needed a better identity guarantee for my use case.)</p><p>There are two mitigations I have proposed for client authors:</p>\n<ol><li><p>Require that the final verified URL be more specific, path-wise, than the original identity request; for example, allow <code>https://example.com/</code> to verify as <code>https://example.com/~alice/</code>, but don\u2019t allow <code>https://example.com/~alice/</code> to verify as <code>https://example.com/~bob/</code></p><p>This is what Authl does at present. Codifying this in spec-friendly language is difficult, however, and there\u2019s a bunch of sharp edges you have to watch out for (such as not allowing <code>https://example.com/~alice</code> to verify as <code>https://example.com/~alice_sucks</code> or whatever; \u201cpath components\u201d aren\u2019t really that strong of a thing in HTTP <em>per se</em>).</p></li>\n<li><p>Require that the final identity URL declares the same <code>authorization_endpoint</code> as the verified URL</p><p>This approach seems preferable, as it is much easier to clearly and concisely explain. 
It also allows for more exotic situations where multiple shared-domain users might want to be able to authenticate as each other for whatever reason (for example, I can see people who identify as plural systems making use of this).</p></li>\n</ol><p>Both of these approaches have been discussed somewhat on <a href=\"https://github.com/indieweb/indieauth/issues/35\">this spec issue</a> as well as on IndieWeb Chat, and the general feeling I\u2019ve gotten is that approach 2 is greatly preferable to everyone. So in the future I will probably change Authl to use approach 2.</p>\n\n<p><a href=\"http://beesbuzz.biz/blog/6265-Two-PSAs-regarding-IndieAuth#comments\">comments</a></p>",
"text": "IndieAuth is starting to get some traction in the greater Internet space, which is really cool! I\u2019m glad to see a protocol finally emerging around distributed/federated identity, managing to get some traction where OpenID more or less failed (despite a few hangers-on still supporting it).There are two issues that implementers of IndieAuth clients (i.e. websites which use IndieAuth for authentication) and endpoints (i.e. the things which do the actual authentication) should be aware of.\n\n\nRecent spec changeThere was a recent change to the specification which leads to some incompatibility around scopeless identity verification. This is a flow that\u2019s used for simple sign-on to websites; for example it\u2019s what Authl uses, as it\u2019s not requesting permission to the user\u2019s resources, but just associating the user with an identity. This is a pretty common use case for IndieAuth.In the previous version of the spec, the scopeless identity verification used response_type=id and didn\u2019t specify a scope; these two options were mutually-exclusive, and this led to a somewhat confusing spec. The latest spec update changes this flow to use response_type=code and to simply omit the scope. This is a much better protocol in general, but it has the problem of being fundamentally incompatible with older endpoints which do strict parameter checking.Unfortunately, the current spec doesn\u2019t document the legacy response_type=id as a backwards-compatibility path, so any newly-written endpoints will not support older clients. 
Newer clients will attempt to send a scopeless response_type=code request which will fail on older endpoints.The nature of the IndieAuth protocol also means that there\u2019s no way for the client to know which protocol version is in use or to get notified that there was a failure, so the end result will be a bad user experience (such as their login page returning an opaque error about \u201cmissing scope\u201d or \u201cinvalid response type\u201d or whatever).So, some suggested mitigations:\nEndpoint authors: Accept response_type=id with no scope parameter, possibly displaying a deprecation warning to the user (asking that they reach out to the client website to update to the latest version of the specification)\nClient authors: Always send response_type=code and request a meaningful but non-invasive-sounding scope, such as me or profile or whatever. (This is what Authl will be doing as of the next release.)\nThe actual scope for clients to request is very much up for debate. Here\u2019s a few different things I\u2019ve seen proposed:\nme: The most straightforward, in keeping with IndieAuth nomenclature. Literally all you\u2019re asking for is the me URL, which is already a basic part of the verification response. But this is kind of confusing to end users.\nprofile: This seems user-friendly, but there\u2019s the possibility that future IndieAuth endpoints might use this as a cue to send the profile directly rather than making it a separate retrieval of h-card data from the identity URL. There\u2019s reasons to do this, so I\u2019d much rather reserve this scope for when IndieAuth supports it natively. On the other hand, it\u2019d also be nice to just automatically start receiving profile data when endpoints start supporting that.\nemail: This implies that someone will be able to do something unspecified with the user\u2019s email. 
It\u2019s also not actually an email address that is being requested in the first place.\nread: This implies being able to read private data. Keep this scope for things like micropub!\nSo for me the decision is either me or profile but both have caveats. I\u2019ll probably go with profile since it seems like the least-confusing thing (at least in the context of Authl), but per the IndieAuth spec you should be able to use any scope you want. But there effectively has to be a scope, at least until every endpoint on the Internet has been updated or at least a reasonable amount of time has passed that all endpoints \u201cshould\u201d have been updated.Identity verification gapIndieAuth is designed under the assumption that everyone has their own domain name, which is an admirable goal but has a few problems for a lot of people, who don\u2019t have the technical acumen necessary to register and/or host their own domain but do have access to shared-domain hosting (such as tilde.club or their university-provided academic homepage). However, there\u2019s nothing in the spec that actually requires that someone own their domain, and it\u2019s quite possible for someone to declare an endpoint on such a webpage, with no way for someone to verify that the webpage is authoritative for the entire domain (after all, it\u2019s just a <link> tag or a response header).Where this becomes a security concern is that since the IndieAuth spec doesn\u2019t require proof of ownership of the domain, it considers any authentication against any page on a domain to be authoritative for the entire domain. As a proof of concept, you can use tilde.club/~fluffy/ as an identity URL that will let you verify as any identity on tilde.club, at least for things that follow the IndieAuth spec. 
(This is one place where Authl diverges from the specification, because I needed a better identity guarantee for my use case.)There are two mitigations I have proposed for client authors:\nRequire that the final verified URL be more specific, path-wise, than the original identity request; for example, allow https://example.com/ to verify as https://example.com/~alice/, but don\u2019t allow https://example.com/~alice/ to verify as https://example.com/~bob/This is what Authl does at present. Codifying this in spec-friendly language is difficult, however, and there\u2019s a bunch of sharp edges you have to watch out for (such as not allowing https://example.com/~alice to verify as https://example.com/~alice_sucks or whatever; \u201cpath components\u201d aren\u2019t really that strong of a thing in HTTP per se).\nRequire that the final identity URL declares the same authorization_endpoint as the verified URLThis approach seems preferable, as it is much easier to clearly and concisely explain. It also allows for more exotic situations where multiple shared-domain users might want to be able to authenticate as each other for whatever reason (for example, I can see people who identify as plural systems making use of this).\nBoth of these approaches have been discussed somewhat on this spec issue as well as on IndieWeb Chat, and the general feeling I\u2019ve gotten is that approach 2 is greatly preferable to everyone. So in the future I will probably change Authl to use approach 2.\n\ncomments"
},
"name": "Plaidophile: Two PSAs regarding IndieAuth",
"post-type": "article",
"_id": "13933853",
"_source": "3782",
"_is_read": true
}
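The client-side mitigation described in the article (always send `response_type=code` plus a benign scope such as `profile`) can be sketched as follows. This is a minimal illustration, not Authl's actual code: the endpoint, client_id, and redirect URLs are made up, and newer-spec details like PKCE are omitted for brevity.

```python
# Sketch of the suggested client mitigation: always send response_type=code
# and a harmless scope so strict endpoints don't reject a scopeless request.
# All URLs below are hypothetical placeholders.
from urllib.parse import urlencode
import secrets

def build_auth_request(authorization_endpoint, me, client_id, redirect_uri):
    """Build an IndieAuth authorization URL for simple identity sign-in."""
    state = secrets.token_urlsafe(16)  # CSRF token, verified on the callback
    params = {
        "response_type": "code",  # current spec; legacy endpoints expected "id"
        "scope": "profile",       # benign scope, per the discussion above
        "me": me,
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
    }
    return f"{authorization_endpoint}?{urlencode(params)}", state

url, state = build_auth_request(
    "https://auth.example.com/authorize",
    me="https://example.com/~alice/",
    client_id="https://client.example.net/",
    redirect_uri="https://client.example.net/callback",
)
```

The `state` value is kept server-side and compared against the callback, as in any OAuth 2-style flow.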
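Mitigation 1 from the identity-verification section (only allow the verified URL to be equal to, or a path-wise descendant of, the URL the user originally entered) could look roughly like this. Comparing whole path segments, rather than doing a raw string-prefix check, is what avoids the `~alice` vs `~alice_sucks` sharp edge the article mentions; this is an illustrative sketch, not Authl's implementation.

```python
# Sketch of mitigation 1: the final verified URL must be the identity URL
# itself or a path-wise descendant of it, compared segment by segment so
# "/~alice" is NOT treated as a prefix of "/~alice_sucks".
from urllib.parse import urlparse

def is_path_descendant(identity_url, verified_url):
    a, b = urlparse(identity_url), urlparse(verified_url)
    if (a.scheme, a.netloc) != (b.scheme, b.netloc):
        return False  # different origin: never authoritative
    seg_a = [s for s in a.path.split("/") if s]
    seg_b = [s for s in b.path.split("/") if s]
    # Every segment of the identity path must match a whole segment
    # of the verified path, in order.
    return seg_b[:len(seg_a)] == seg_a
```

Mitigation 2 (compare the `authorization_endpoint` declared by the identity URL and the verified URL) needs no path logic at all, which is part of why the article finds it easier to specify.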
This bit around compiling and rendering sounds specific to the problems of using static sites. If it were done completely remotely, at the level that something like Disqus or Commento operates, it’d still only handle a fraction of what Webmentions are capable of. This, of course, requires time and effort to build. And since there isn’t a huge VC company backing such a notion, we won’t see it anytime soon.
{
"type": "entry",
"published": "2020-08-11T09:55:09.43979-07:00",
"url": "https://v2.jacky.wtf/post/cc7c38d9-4688-4384-806c-8c1bd3cb0a3c",
"in-reply-to": [
"https://lobste.rs/s/2kb9qt/what_we_talk_about_when_we_re_talking_about#c_trowmt"
],
"content": {
"text": "This bit around compiling and rendering sounds specific to the problems with using static sites. If it\u2019s done completely remotely, to the level that something like Disqus or Commento could handle it, it\u2019d just handle a fraction of what Webmentions are capable of. This, of course, requires time and effort to build. And since there isn\u2019t a huge VC company backing such a notion, we won\u2019t see it anytime soon.",
"html": "<p>This bit around compiling and rendering sounds specific to the problems with using static sites. If it\u2019s done completely remotely, to the level that something like Disqus or Commento could handle it, it\u2019d just handle a fraction of what Webmentions are capable of. This, of course, requires time and effort to build. And since there isn\u2019t a huge VC company backing such a notion, we won\u2019t see it anytime soon.</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "reply",
"refs": {
"https://lobste.rs/s/2kb9qt/what_we_talk_about_when_we_re_talking_about#c_trowmt": {
"type": "entry",
"url": "https://lobste.rs/s/2kb9qt/what_we_talk_about_when_we_re_talking_about#c_trowmt",
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf/stream",
"photo": null
},
"post-type": "note"
}
},
"_id": "13930737",
"_source": "1886",
"_is_read": true
}
M.A. Matienzo
Over and in response to the last few months, I’ve been reflecting on intentionality and how I spend my time creating things. I have tried to improve the indiewebbiness of my site and to understand what it means to “scratch my own itch”. This resonates particularly lately because it’s leading me to mull over which parts should be hard and which easy. Unsurprisingly, much of that is personal preference, and figuring out how I want to optimize from the perspective of user experience. Friction in UX can be a powerful tool; part of what I’m trying to find is where I want to retain friction, as it helps me remain intentional.
{
"type": "entry",
"published": "2020-08-10T21:21:51-0700",
"summary": "Over and in response to the last few months, I\u2019ve been reflecting about intentionality, and how I spend my time creating things. I have tried to improve the indiewebbiness of my site, and understanding what it means to \u201cscratch my own itch\u201d. This resonates particularly lately because it\u2019s leading me to mull over which parts should be hard and easy. Unsurprisingly, much of that is personal preference, and figuring out how I want to optimize from the perspective of user experience. Friction in UX can be a powerful tool, part of what I\u2019m trying to find is where I want to retain friction as it helps me remain intentional.",
"url": "https://matienzo.org/2020/optimizing-friction/",
"category": [
"indieweb",
"music",
"plan9",
"food"
],
"syndication": [
"https://twitter.com/anarchivist/status/1293042174972563456",
"https://chaos.social/@anarchivist/104668872964686387",
"https://news.indieweb.org/en/matienzo.org/2020/optimizing-friction/"
],
"name": "Optimizing friction",
"author": {
"type": "card",
"name": "M.A. Matienzo",
"url": false,
"photo": "https://matienzo.org/images/matienzo.jpg"
},
"post-type": "article",
"_id": "33379119",
"_source": "7223",
"_is_read": true
}
{
"type": "entry",
"author": {
"name": "Neil Mather",
"url": "https://doubleloop.net/",
"photo": null
},
"url": "https://doubleloop.net/2020/08/10/a-tapestry-of-humanity-and-what-we-know/",
"published": "2020-08-10T22:10:04+00:00",
"content": {
"html": "<blockquote><p>The World Wide Web was all about you create knowledge and how you create hyperlinks to different pieces of knowledge. So over time we kind of create this tapestry of humanity and what we know, which at another level of abstraction is kind of creating this meta-map of humanity. This meta-map of humanity is now controlled by these companies that are the ones who have access to the backend.</p>\n<p>\u2013 <a href=\"https://www.buzzsprout.com/1004689/4587590-the-global-south-holds-a-better-future-of-tech-w-juan-ortiz-freuler\">The Global South Holds a Better Future of Tech w/ Juan Ortiz Freuler</a></p></blockquote>\n<p>I love the phrase tapestry of humanity here. I like the <a href=\"https://commonplace.doubleloop.net/indieweb.html\">IndieWeb</a> and <a href=\"https://commonplace.doubleloop.net/wikis.html\">personal wikis</a> as a way of being part of the warp and weft of that tapestry.</p>",
"text": "The World Wide Web was all about you create knowledge and how you create hyperlinks to different pieces of knowledge. So over time we kind of create this tapestry of humanity and what we know, which at another level of abstraction is kind of creating this meta-map of humanity. This meta-map of humanity is now controlled by these companies that are the ones who have access to the backend.\n\u2013 The Global South Holds a Better Future of Tech w/ Juan Ortiz Freuler\nI love the phrase tapestry of humanity here. I like the IndieWeb and personal wikis as a way of being part of the warp and weft of that tapestry."
},
"name": "A tapestry of humanity and what we know",
"post-type": "article",
"_id": "13911719",
"_source": "1895",
"_is_read": true
}
I’m not entirely sure, but given the large demographic of people on Parler who seem to think the First Amendment applies to social media companies rather than to its actual target (the feds), I’d personally block that whole site from my IndieWeb setup.
{
"type": "entry",
"published": "2020-08-10T07:23:26.43947-07:00",
"url": "https://v2.jacky.wtf/post/5032f89e-dcb7-45b7-b958-7d242f72042a",
"in-reply-to": [
"https://twitter.com/Cambridgeport90/status/1292803078723219458"
],
"content": {
"text": "I\u2019m not explicitly sure but from the large demographic of people on Parler that seem to conflate the First Amendment to apply to social media companies and not who it\u2019s intended for (the feds), I\u2019d personally block that whole site from my IndieWeb setup.",
"html": "<p>I\u2019m not explicitly sure but from the large demographic of people on Parler that seem to conflate the First Amendment to apply to social media companies and not who it\u2019s intended for (the feds), I\u2019d personally block that whole site from my IndieWeb setup.</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "reply",
"refs": {
"https://twitter.com/Cambridgeport90/status/1292803078723219458": {
"type": "entry",
"url": "https://twitter.com/Cambridgeport90/status/1292803078723219458",
"author": {
"type": "card",
"name": "twitter.com",
"url": "https://twitter.com/Cambridgeport90/status/1292803078723219458",
"photo": null
},
"post-type": "note"
}
},
"_id": "13911704",
"_source": "1886",
"_is_read": true
}
{
"type": "entry",
"published": "2020-08-09T18:41:00+01:00",
"url": "https://www.jvt.me/mf2/2020/08/gedst/",
"category": [
"indieweb"
],
"bookmark-of": [
"http://cascadiainspired.com/excited-about-the-indie-web/"
],
"author": {
"type": "card",
"name": "Jamie Tanna",
"url": "https://www.jvt.me",
"photo": "https://www.jvt.me/img/profile.png"
},
"post-type": "bookmark",
"_id": "13881767",
"_source": "2169",
"_is_read": true
}
{
"type": "entry",
"published": "2020-08-08T09:54:43.51138-07:00",
"url": "https://v2.jacky.wtf/post/79cd622f-7ae5-47cd-a94e-e15bfaf6b4b1",
"category": [
"events"
],
"content": {
"text": "Currently at https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8 talking shop about http://indieauth.spec.indieweb.org/.",
"html": "<p>Currently at <a href=\"https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8\">https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8</a> talking shop about <a href=\"http://indieauth.spec.indieweb.org/\">http://indieauth.spec.indieweb.org/</a>.</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "note",
"_id": "13867702",
"_source": "1886",
"_is_read": true
}
going to part of the IndieAuth Pop-up Session
🗓 9:30-11:30am PDT, Sat 8/8
🎟 RSVP: https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8
#IndieAuth is the most implemented decentralized #identity #protocol, built on #OAuth 2.
#IndieWeb #OpenWeb #WebIdentity
#noBlockchain needed
{
"type": "entry",
"published": "2020-08-07 10:18-0700",
"rsvp": "yes",
"url": "http://tantek.com/2020/220/t1/",
"category": [
"IndieAuth",
"identity",
"protocol",
"OAuth",
"IndieWeb",
"OpenWeb",
"WebIdentity",
"noBlockchain"
],
"in-reply-to": [
"https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8"
],
"content": {
"text": "going to part of the IndieAuth Pop-up Session\n\ud83d\uddd3 9:30-11:30am PDT, Sat 8/8\n\ud83c\udf9f RSVP: https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8\n\n#IndieAuth is the most implemented decentralized #identity #protocol, built on #OAuth 2.\n\n#IndieWeb #OpenWeb #WebIdentity \n#noBlockchain needed",
"html": "going to part of the IndieAuth Pop-up Session<br />\ud83d\uddd3 9:30-11:30am PDT, Sat 8/8<br />\ud83c\udf9f RSVP: <a href=\"https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8\">https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8</a><br /><br />#<span class=\"p-category\">IndieAuth</span> is the most implemented decentralized #<span class=\"p-category\">identity</span> #<span class=\"p-category\">protocol</span>, built on #<span class=\"p-category\">OAuth</span> 2.<br /><br />#<span class=\"p-category\">IndieWeb</span> #<span class=\"p-category\">OpenWeb</span> #<span class=\"p-category\">WebIdentity</span> <br />#<span class=\"p-category\">noBlockchain</span> needed"
},
"author": {
"type": "card",
"name": "Tantek \u00c7elik",
"url": "http://tantek.com/",
"photo": "https://aperture-media.p3k.io/tantek.com/acfddd7d8b2c8cf8aa163651432cc1ec7eb8ec2f881942dca963d305eeaaa6b8.jpg"
},
"post-type": "rsvp",
"refs": {
"https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8": {
"type": "entry",
"url": "https://events.indieweb.org/2020/08/indieauth-1-1-identity-protocol-standards-session-6xlxgeCEMgv8",
"name": "an IndieWeb event",
"post-type": "article"
}
},
"_id": "13839016",
"_source": "1",
"_is_read": true
}
For some reason specific to Bluebird, I can’t get the Microformats JavaScript library working. I can see how this would be a huge roadblock for anyone wanting to build IndieWeb stuff if a core component of it (Microformats) can’t be made to work. Lemme see if I can reproduce this and figure out whether it can be patched.
{
"type": "entry",
"published": "2020-08-07T08:05:00.00000-07:00",
"url": "https://v2.jacky.wtf/post/9b364ffa-bc81-4ce5-a77c-67ef02f1355b",
"category": [
"bugs",
"JavaScript"
],
"content": {
"text": "For some reason specific to Bluebird, I can\u2019t get the Microformats JavaScript library working. I can see how this would be a huge road block for anyone wanting to build IndieWeb stuff if a core component of it (Microformats) can\u2019t be worked. Lemme see if I can reproduce this and figure out if it can be patched.",
"html": "<p>For some reason specific to Bluebird, I can\u2019t get the Microformats JavaScript library working. I can see how this would be a huge road block for anyone wanting to build IndieWeb stuff if a core component of it (Microformats) can\u2019t be worked. Lemme see if I can reproduce this and figure out if it can be patched.</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "note",
"_id": "13832907",
"_source": "1886",
"_is_read": true
}
{
"type": "entry",
"published": "2020-08-06 13:08-0700",
"url": "http://tantek.com/2020/219/b2/",
"category": [
"backfeed"
],
"in-reply-to": [
"https://github.com/snarfed/bridgy/issues"
],
"name": "backfeed GitHub labels on your issues",
"content": {
"text": "On GitHub, project team members are able to add labels to your issues on a project. If your issue is a POSSE copy of an original post on your site, Bridgy should backfeed these as \"tag-of\" responses to the original post.\n\n\nBridgy Publish already \nsupports POSSEing tag-of posts in reply to GitHub issues, \nas labels on those issues, and this is the backfeed counterpart. Brainstormed here: \nhttps://indieweb.org/tag-reply#How_to_post_a_tag-reply.\n\n\nThis is similar to \nissue #776 \nwhich is the same backfeed feature request but for Flickr.\n\n\nThis is also the \u201clabeled\u201d specific subfeature of \nissue #833 \nwhich documents many more backfeed for GitHub requests.\n\n\nAnd similar to \nthis comment on #811 \n(original post) \nrequesting Bridgy Publish untag-of support, \nit\u2019s worth considering Bridgy Backfeed untag-of support \n(the \u201cunlabeled\u201d specific subfeature of #833), \nso when someone removes a label from your issue, your original issue post is notified. However, the \nbrainstorming of how to markup untagging \nis still ongoing, and thus may need to wait for more discussion before implementing.\n\n\nLabel: backfeed.",
"html": "<p>\nOn GitHub, project team members are able to add labels to your issues on a project. If your issue is a POSSE copy of an original post on your site, Bridgy should backfeed these as \"tag-of\" responses to the original post.\n</p>\n<p>\nBridgy Publish already \n<a href=\"https://github.com/snarfed/bridgy/issues/811\">supports POSSEing tag-of posts in reply to GitHub issues</a>, \nas labels on those issues, and this is the backfeed counterpart. Brainstormed here: \n<a href=\"https://indieweb.org/tag-reply#How_to_post_a_tag-reply\">https://indieweb.org/tag-reply#How_to_post_a_tag-reply</a>.\n</p>\n<p>\nThis is similar to \n<a href=\"https://github.com/snarfed/bridgy/issues/776\">issue #776</a> \nwhich is the same backfeed feature request but for Flickr.\n</p>\n<p>\nThis is also the \u201clabeled\u201d specific subfeature of \n<a href=\"https://github.com/snarfed/bridgy/issues/833\">issue #833</a> \nwhich documents many more backfeed for GitHub requests.\n</p>\n<p>\nAnd similar to \n<a href=\"https://github.com/snarfed/bridgy/issues/811#issuecomment-382469530\">this comment on #811</a> \n(<a href=\"https://tantek.com/2018/108/t2/untag-of-bridgy-publish-github-label\">original post</a>) \nrequesting Bridgy Publish untag-of support, \nit\u2019s worth considering Bridgy Backfeed untag-of support \n(the \u201cunlabeled\u201d specific subfeature of #833), \nso when someone removes a label from your issue, your original issue post is notified. However, the \n<a href=\"https://indieweb.org/untag#How_to_mark_up.3F\">brainstorming of how to markup untagging</a> \nis still ongoing, and thus may need to wait for more discussion before implementing.\n</p>\n<p>\nLabel: <span class=\"p-category\">backfeed</span>.\n</p>"
},
"author": {
"type": "card",
"name": "Tantek \u00c7elik",
"url": "http://tantek.com/",
"photo": "https://aperture-media.p3k.io/tantek.com/acfddd7d8b2c8cf8aa163651432cc1ec7eb8ec2f881942dca963d305eeaaa6b8.jpg"
},
"post-type": "reply",
"refs": {
"https://github.com/snarfed/bridgy/issues": {
"type": "entry",
"url": "https://github.com/snarfed/bridgy/issues",
"name": "GitHub project \u201cbridgy\u201d",
"post-type": "article"
}
},
"_id": "13813244",
"_source": "1",
"_is_read": true
}
{
"type": "entry",
"published": "2020-08-06 11:46-0700",
"url": "http://tantek.com/2020/219/b1/",
"category": [
"enhancement",
"question",
"needs proposed resolution"
],
"in-reply-to": [
"https://github.com/microformats/microformats2-parsing/issues"
],
"name": "Should we specify a MIME type / Content-Type for canonical JSON from parsed mf2?",
"content": {
"text": "There has been some past brainstorming about possible MIME types for the JSON resulting from a compliant microformats2 parsing implementation:\nmicroformats2-mime-type. \nIt seems one in particular, application/mf2+json, has seen some adoption in the wild: https://indieweb.org/application/mf2+json.\nShould we specify an explicit MIME type for the parsed JSON result of an mf2 parser? And if so, should we adopt application/mf2+json or some other alternative?\n\n\nLabels: enhancement, \nquestion, \nneeds proposed resolution.",
"html": "<p>\nThere has been some past brainstorming about possible MIME types for the JSON resulting from a compliant microformats2 parsing implementation:\n<a href=\"https://microformats.org/wiki/microformats2-mime-type\">microformats2-mime-type</a>. \nIt seems one in particular, <code>application/mf2+json</code>, has seen some adoption in the wild: <a href=\"https://indieweb.org/application/mf2+json\">https://indieweb.org/application/mf2+json</a>.\nShould we specify an explicit MIME type for the parsed JSON result of an mf2 parser? And if so, should we adopt <code>application/mf2+json</code> or some other alternative?\n</p>\n<p>\nLabels: <span class=\"p-category\">enhancement</span>, \n<span class=\"p-category\">question</span>, \n<span class=\"p-category\">needs proposed resolution</span>.\n</p>"
},
"author": {
"type": "card",
"name": "Tantek \u00c7elik",
"url": "http://tantek.com/",
"photo": "https://aperture-media.p3k.io/tantek.com/acfddd7d8b2c8cf8aa163651432cc1ec7eb8ec2f881942dca963d305eeaaa6b8.jpg"
},
"post-type": "reply",
"refs": {
"https://github.com/microformats/microformats2-parsing/issues": {
"type": "entry",
"url": "https://github.com/microformats/microformats2-parsing/issues",
"name": "GitHub project \u201cmicroformats2-parsing\u201d",
"post-type": "article"
}
},
"_id": "13813245",
"_source": "1",
"_is_read": true
}
I’m thinking about Webmention feeds and I’m noticing how special they might be compared to some other feeds. For example, if I wanted something that just spat out every Webmention I’ve received, that’d get noisy fast if I have a very busy stream of incoming Webmentions. I do want to consider generating a feed that’d re-sort itself based on the most recent activity but also provide context about the volume of activity that happened on said resource. Perhaps making the specific feed of activity for said resource the target of the feed, as well as linking to the resource in question, would work. Lots of variability here, to be honest, and I have an interest in keeping the “how” of this very straightforward. This would all be stuff you’d see in a social reader, for some context.
{
"type": "entry",
"published": "2020-08-06T12:21:00.00000-07:00",
"url": "https://v2.jacky.wtf/post/464372b9-9221-4a92-b9f2-70a2e65208c8",
"content": {
"text": "I\u2019m thinking about Webmention feeds and I\u2019m noticing now how special they might be compared to some other feeds. For example, if I wanted to do something that just spat out every Webmention I\u2019ve received, that\u2019d get noisy fast if I have a very busy stream of incoming Webmentions. I do want to consider generating a feed that\u2019d resort itself based on the last recent activity but also provide context about the volume of activity that happened on said resource. Perhaps making the specific feed of activity of said resource be the target of the feed as well as linking to the resource in question would work. Lots of variability here to be honest and I\u2019ve an interest in keeping the \u201chow\u201d this works very straight forward. This would be all stuff you\u2019d see in a social reader, for some context.",
"html": "<p>I\u2019m thinking about Webmention feeds and I\u2019m noticing now how special they might be compared to some other feeds. For example, if I wanted to do something that just spat out every Webmention I\u2019ve received, that\u2019d get noisy fast if I have a very busy stream of incoming Webmentions. I do want to consider generating a feed that\u2019d resort itself based on the last recent activity but also provide context about the volume of activity that happened on said resource. Perhaps making the specific feed of activity of said resource be the target of the feed as well as linking to the resource in question would work. Lots of variability here to be honest and I\u2019ve an interest in keeping the \u201chow\u201d this works very straight forward. This would be all stuff you\u2019d see in a social reader, for some context.</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "note",
"_id": "13811330",
"_source": "1886",
"_is_read": true
}
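The grouping idea above (one feed item per mentioned resource, re-sorted by latest activity, with a count for context) can be sketched in a few lines. The field names (`target`, `received`) are made up for illustration; a real implementation would work off whatever schema the Webmention receiver stores.

```python
# Rough sketch: instead of one noisy feed entry per incoming Webmention,
# group mentions by the page they target, keep a count, and re-sort so the
# most recently active resource floats to the top. Field names are invented.
from collections import defaultdict

def activity_feed(webmentions):
    """webmentions: iterable of dicts with 'target' and 'received' timestamps."""
    by_target = defaultdict(list)
    for wm in webmentions:
        by_target[wm["target"]].append(wm["received"])
    feed = [
        {"target": target, "count": len(times), "latest": max(times)}
        for target, times in by_target.items()
    ]
    feed.sort(key=lambda item: item["latest"], reverse=True)
    return feed

mentions = [
    {"target": "/post/a", "received": 1},
    {"target": "/post/b", "received": 2},
    {"target": "/post/a", "received": 5},
]
feed = activity_feed(mentions)  # /post/a first: two mentions, latest at t=5
```

Each feed item could then link both to the mentioned resource and to a per-resource activity feed, as the note suggests.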
I think I either want native “link quoting” / citations in my site and in Indigenous, or some way to do that automatically. This hints back at a conversation in the IndieWeb channels about how posting interfaces have homogenized to be “intelligent” (in the sense that, by setting certain fields, the post type can be inferred). I think it’s more constraint-driven for sure, but due to the hyper-plurality of content types we engage with, it’s easier to throw users a kitchen sink than to provide more intelligent interfaces. We’re making strides towards that, though!
{
"type": "entry",
"published": "2020-08-05T06:40:00.00000-07:00",
"url": "https://v2.jacky.wtf/post/299e4167-19f2-41bf-9dc2-7dc539feca79",
"category": [
"indieweb",
"thoughts"
],
"content": {
"text": "I think I either want native \u201clink quoting\u201d / citations in my site and in Indigenous or someway to automatically do that. This hints back at a conversation in the IndieWeb channels about how posting interfaces have homogenized to be \u201cintelligent\u201d (in the sense that by setting certain fields, the post type can be inferred). I think it\u2019s more constraint driven for sure but due to the hyper plurality of content types we\u2019ve engaged with, it\u2019s easier to throw users a kitchen sink versus providing more intelligent interfaces. We\u2019re making strides towards that though!",
"html": "<p>I <em>think</em> I either want native \u201clink quoting\u201d / citations in my site and in Indigenous <em>or</em> someway to automatically do that. This hints back at a conversation in the IndieWeb channels about how posting interfaces have homogenized to be \u201cintelligent\u201d (in the sense that by setting certain fields, the post type can be inferred). I think it\u2019s more constraint driven for sure but due to the hyper plurality of content types we\u2019ve engaged with, it\u2019s easier to throw users a kitchen sink versus providing more intelligent interfaces. We\u2019re making strides towards that though!</p>"
},
"author": {
"type": "card",
"name": "",
"url": "https://v2.jacky.wtf",
"photo": null
},
"post-type": "note",
"_id": "13775069",
"_source": "1886",
"_is_read": true
}
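The “set certain fields and the post type falls out” behavior mentioned above is, loosely, the IndieWeb Post Type Discovery idea: the interface never asks for a type, it infers one from which properties are present. A simplified sketch (not the full algorithm, and the property names follow the mf2 conventions used in the entries on this page):

```python
# Simplified sketch of field-driven post-type inference, loosely modeled on
# IndieWeb Post Type Discovery: which properties are set determines the type.
def infer_post_type(post):
    if post.get("rsvp"):
        return "rsvp"
    if post.get("in-reply-to"):
        return "reply"
    if post.get("repost-of"):
        return "repost"
    if post.get("like-of"):
        return "like"
    if post.get("bookmark-of"):
        return "bookmark"
    if post.get("photo"):
        return "photo"
    # A name that isn't just the start of the content implies an article;
    # otherwise it's a plain note.
    name = (post.get("name") or "").strip()
    content = (post.get("content") or "").strip()
    if name and not content.startswith(name):
        return "article"
    return "note"

infer_post_type({"in-reply-to": ["https://example.com/a"]})  # → "reply"
```

This is exactly why the JSON records above carry a `post-type` of `reply`, `rsvp`, `bookmark`, and so on without any author having picked one by hand.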