
Herein you’ll find articles on a wide variety of topics about technology in the consumer space (mostly) and items of personal interest to me. I have also participated in and created several podcasts, most notably Pragmatic and Causality; all of my podcasts can be found at The Engineered Network.

Building Your Own Bitcoin Lightning Node

After my previous attempts to build my own node to take control of my slowly growing podcast streaming income didn’t go so well, I decided to bite the bullet and build my own Lightning node with new hardware. My criteria were:

  1. Minimise expenditure and transaction fees (host my own node)
  2. Must be always connected (via home internet is fine)
  3. Use low-cost hardware and open-source software with minimal command-line work

Because of the above, I couldn’t use my MacBook Pro, since it comes with me regularly when I leave the house. I tried to use my Synology, but that didn’t work out. The next best option was a Raspberry Pi, and two of the most popular options out there are the RaspiBolt and the RaspiBlitz. (Umbrel is coming along, but isn’t quite as far advanced as the other two.)

The Blitz was my choice as it seems to be the more popular of the two and I could build it easily enough myself. The GitHub repo is very detailed and extremely helpful. This article is not intended to simply repeat those instructions, but rather to describe my own experience in building my own Blitz.

Parts

The GitHub instructions suggest Amazon links, but in Australia Amazon isn’t what it is in the States or even Europe. So instead I sourced the parts from a local importer of Raspberry Pi parts. I picked from the “Standard” list:

Core Electronics

  • $92.50 / Raspberry Pi 4 Model B 4GB
  • $16.45 / Raspberry Pi 4 Power Supply (Official) - USB-C 5.1V 15.3W (White)
  • $23.50 / Aluminium Heatsink Case for Raspberry Pi 4 Black (Passive Cooling, Silent)
  • $34.65 / Waveshare 3.5inch LCD 480x320 (The LCD referred to was a 3.5" RPi Display, GPIO connection, XPT2046 Touch Controller but they had either no stock on Amazon or wouldn’t ship to Australia)

All the parts from Core Electronics

UMart

  • $14 / Samsung 32GB Micro SDHC Evo Plus W90MB Class 10 with SD Adapter

On Hand

I used the 512GB Samsung USB 3.0 SSD I already had laying around from my previous attempts. Admittedly, a 1TB SSD and case would’ve cost an additional $160 AUD, and in future I’ll probably extend to a fully future-proofed 2TB SSD. At this point the Bitcoin blockchain uses about 82% of my current drive, so a bigger SSD is on the cards for me in the next 6 to 9 months for sure.

Total cost: $181.10 AUD (about $139 USD or 300k Sats at time of writing)

The WaveShare LCD Front View

The WaveShare LCD Rear View

Assembly

The power supply is simple: unwrap, plug into the USB-C power port and done. The heatsink comes with different-sized thermal pads to sandwich between the heatsink and the key components on the Pi motherboard, and four screws to clamp the two pieces together around the motherboard. Finally, line up the screen with the outer-most pins on the I/O header and gently press them together. They won’t sit flat against the heatsink/case, but they don’t have to in order to connect well.

The Power Supply

The HeatSink

The Raspberry Pi 4B Motherboard

Burning the Image

I downloaded the boot image from the GitHub repo and used Balena Etcher on my MacBook Pro to write it to the Micro SD card. Afterwards, insert the card into the Raspberry Pi, connect the SSD to the motherboard-side USB 3.0 port, connect an Ethernet cable and then power it up!
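If you prefer the command line to Etcher, the same job can be done on macOS with dd; a minimal sketch, assuming the downloaded image is named raspiblitz.img and the SD card appears as /dev/disk4 (verify with diskutil first, as writing to the wrong disk is destructive):

    # Identify the SD card first; the disk number below is an assumption
    diskutil list

    # Unmount the card, then write the image (rdisk is the faster raw device)
    diskutil unmountDisk /dev/disk4
    sudo dd if=raspiblitz.img of=/dev/rdisk4 bs=1m
    diskutil eject /dev/disk4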

Installing the System

If everything is hooked up correctly (and you have a router/DHCP server on the hardwired Ethernet you just connected it to), the screen should light up with the DHCP-allocated IP address you can reach it on, along with instructions on how to SSH in via the terminal, like “ssh admin@192.168.x.x” or similar. Open up Terminal, enter that, and you’ll get a nice neat blue screen with the same information on it. From here everything is done via the menu installer.

If you get kicked out of that interface, just enter ‘raspiblitz’ and it will restart the menu.

Getting the Order Right

  1. Pick Your Poison: For me, I chose Bitcoin and Lightning, which is the default; there are other cryptocurrencies if that’s your preference. Then set your passwords, and please use a password manager with at least 32 characters - make it as secure as you can from Day One!
  2. Tor vs Public IP: Some privacy nuts run behind Tor to obscure their identity and location. I’ve done both and can tell you that Tor takes a lot longer to sync and access, kills off a lot of apps, and makes opening channels to some other nodes and services difficult or impossible. For me, I just wanted a working node that was as interoperable as possible, so I chose Public IP.
  3. Let the Blockchain Sync: Once your SSD is formatted, if you have the patience then I recommend syncing the blockchain from scratch. I already had a copy that I SCP’d across from my Synology, and it saved me about 36 hours, but it also caused my installer to ungracefully exit and it took me another day of messing with the command line to get it to start again and complete the installation. In retrospect, not a time saver end to end, but your mileage may vary.
  4. Set up a New Node: Or in my case, recover an old node; at this point I copied my channel.backup over, but for most others it’s a new node and a new wallet. And for goodness sake, when you make a new wallet: KEEP A COPY OF YOUR SEED WORDS!!!
  5. Let Lightning “Sync”: Technically it’s validating blocks, but this also takes a while. For me it took nearly 6 hours for both the Lightning and Bitcoin blocks to sync; you can check on progress with the sketch below.
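To check on progress from an SSH session while you wait, a minimal sketch, assuming the RaspiBlitz defaults where bitcoin-cli and lncli are already on the admin user’s PATH:

    # Blockchain sync: verificationprogress climbs towards 1.0 as validation completes
    bitcoin-cli getblockchaininfo | grep -E '"blocks"|"headers"|"verificationprogress"'

    # Lightning sync: synced_to_chain flips to true once LND has caught up
    lncli getinfo | grep -E '"synced_to_chain"|"block_height"'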

The Final Assembled Node up and Running

My Money from Attempt 2 on the Synology Recovered!

I copied the channel.backup and wallet.dat files from the Synology and successfully recovered the $60 AUD investment from my previous attempts, so that’s good! (And it worked pretty easily, actually.)

To prevent any loss of the wallet, I’ve also added a USB 3.0 thumb drive to the other USB 3.0 port and set up “Static Channel Backup on USB Drive”, which required a brief format to EXT4 but worked without any real drama.

Conclusion

Building the node using a salvaged SSD cost under $200 AUD and took about two days to sync and set up. Installing the software and setting up all the services is another story for another post, but it’s working great!

Bitcoin, Lightning and Patience

I’ve been vaguely aware of Bitcoin for a decade but never really dug into it until recently, as a direct result of my interest in the Podcasting 2.0 team.

My goals were:

  1. Minimise expenditure and transaction fees
  2. Use existing hardware and open-source software
  3. Setup a functional lightning node to both make and accept payments

I’m the proud owner of a Synology, it can run Docker, and you can run Bitcoin and Lightning in Docker containers? Okay then…this should be easy enough, right?

Bitcoin Node Take 1

I set up the kylemanna/bitcoind docker on my Synology and started it syncing the Mainnet blockchain. About a week later I was sitting at 18% complete, averaging 1.5% per day and dropping. Reading up on this, the problem was two-fold: validating the blockchain is a CPU- and HDD/SSD-intensive task, and my Synology was weak on both counts. I threw more RAM at it (3GB out of the 4GB it had) with no difference in performance, set the CPU restrictions to give the docker the most performance possible with no difference, and basically ran out of levers to pull.
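For context, spinning that container up looks roughly like this; a sketch, where the Synology volume path is my assumption:

    # Run kylemanna/bitcoind with the chain data mapped onto the NAS volume
    # (the /volume1/docker path is an assumption; adjust to your shared folder)
    docker run -d --name bitcoind \
        -v /volume1/docker/bitcoind:/bitcoin \
        -p 8333:8333 \
        kylemanna/bitcoind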

I then learned that it’s possible to copy a blockchain from one device to another, and that the Raspberry Pis sold as private nodes come with the blockchain pre-synced (up to the point they’re shipped) so they don’t take too long to catch up to the tip of the chain. So I downloaded Bitcoin Core for macOS and set it running. After two days it had finished (much better) and I copied the directories to the Synology, only to find that Bitcoin Core was set to “prune” the blockchain after validation, meaning the entire blockchain was no longer stored on my Mac, and the docker container would need to start over.

Ugh.

So I disabled pruning on the Mac and started again. The blockchain was about 300GB (so I was told), and with the 512GB SSD in my MBP I thought that would be enough, but alas no. As the amount of free space diminished at a rapid rate of knots, I madly off-loaded and deleted what I could, finishing with about 2GB to spare; the entire blockchain and associated files weighed in at 367GB.
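For anyone following along, pruning is a one-line setting in Bitcoin Core’s bitcoin.conf (it can also be toggled in the GUI preferences); a minimal sketch for macOS:

    # Disable pruning so the full chain is kept on disk (macOS config path shown)
    # Note: going from pruned back to full means re-downloading the whole chain
    echo 'prune=0' >> ~/Library/Application\ Support/Bitcoin/bitcoin.conf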

Transferring them to the Synology and firing up the docker…it worked! Although it had to revalidate the 6 most recent blocks (taking about 26 minutes EVERY time the Bitcoin docker restarted), it sprang to life nicely. I had a Bitcoin node up and running!

Lightning Node Take 1

There are several docker containers to choose from; the two most popular seemed to be LND and c-lightning. Without understanding the differences, I went with the container that was said to be more lightweight and work better on a Synology: c-lightning.

Later I was to discover that many plugins, applications, GUIs and relays (Sphinx, for example) only work with LND and require LND macaroons, which c-lightning doesn’t support. Not only that, design decisions by the c-lightning developers to only permit a single channel between any two nodes make building liquidity problematic when you’re starting out. (More on that in another post someday…)

After messing around with RPC to get the c-lightning docker communicating with the kylemanna bitcoind docker, I realised that I needed ZMQ support, since RPC username and password authentication was being phased out in favour of token (cookie) authentication via a shared folder.
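The ZMQ side boils down to a couple of lines in the bitcoind configuration; a sketch, where the host path is my assumption (the option names and conventional ports are real):

    # Publish raw blocks and transactions over ZMQ so the Lightning container can subscribe
    cat >> /volume1/docker/bitcoind/bitcoin.conf <<'EOF'
    zmqpubrawblock=tcp://0.0.0.0:28332
    zmqpubrawtx=tcp://0.0.0.0:28333
    EOF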

UGH

I was so frustrated at losing 26 minutes every time I had to change a single setting in the Bitcoin docker. Then, in an incident overnight, both dockers crashed, didn’t restart, and took over a day to catch up to the blockchain again. At this point I had more or less decided to give up on it.

SSD or don’t bother

Interestingly, my oldest son pointed out that all of the kits for sale used SSDs for the Bitcoin data storage - even the cheapest versions. A bit more research and it turns out that crunching through the blockchain is less of a CPU-intensive exercise and more of a data-store read/write-intensive exercise. I had a 512GB Samsung USB 3.0 SSD laying around, and in a fit of insanity decided to try connecting it to the Synology’s rear port, shift the entire contents of the docker shared folders (that contained all of the blocks and indexes) to that SSD and try it again.

Oh My God it was like night and day.

Both docker containers started, synced and were running in minutes. Suddenly I was interested again!

Bitcoin Node Take 2

With renewed interest I returned to my previous headache: linking the docker containers properly. The LNCM/bitcoind docker had precompiled support for ZMQ, and it was surprisingly easy to set up the docker shared folder to expose the token I needed for authentication with the c-lightning docker image. It started up referencing the same docker folder (now mounted on the SSD) and honestly seemed to “just work” straight up. So far so good.

Lightning Node Take 2

This time I went for the better-supported LND, picked a quite popular image by Guggero, and also spun it up rather quickly. My funds on my old c-lightning node would simply have to remain trapped until I could figure out how to recover them in the future.

Host-Network

The instructions I had read all related to TestNet and advised not to use money you weren’t prepared to lose. I set myself a starting budget of $40 AUD and tried to make this work. Using the Breez app on iOS and its MoonPay integration, I managed to convert that into about 110k sats. The next problem was getting them from Breez to my own node: my attempts via Lightning failed with “no route” (I learned later I needed channels…d’uh), so sending via Bitcoin was the only option. “On-chain” they call it. This cost me a lot of sats, but I finally had some sats on my node.

Satoshis

Bitcoin has a few quirky little problems. One interesting one is that a single Bitcoin is worth a LOT of money - currently 1 BTC = $62,000.00 AUD. It’s not a practical measure, hence Bitcoin amounts are more commonly referred to in Satoshis, each being 1/100,000,000th of a Bitcoin. Bitcoin is a crypto-currency transacted on the Bitcoin blockchain via the Bitcoin network. Lightning is a Layer 2 network that also deals in Bitcoin, but in smaller amounts, peer-to-peer, connected via channels; because the values are much smaller, it’s regularly transacted in Satoshis.

Everything you do requires Satoshis (sats). It costs sats to fund a channel. It costs sats to close a channel. I couldn’t find a way to determine the minimum number of sats needed to open a channel without first attempting to open one via the command line (see the sketch below). I only had a limited number of sats to play with, so I had to choose carefully: most channels wanted 10,000 or 20,000, but I managed to find a few that only required 1,000. The initial thought was to open as many channels as I could, make some transactions, and let my inbound liquidity improve as others in the network transact.
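For the curious, opening a channel from the command line with LND goes roughly like this; a sketch, where the pubkey, host and amount are placeholders:

    # Peer with the remote node first (the pubkey@host:port is a placeholder)
    lncli connect 02abc...nodepubkey...def@203.0.113.7:9735

    # Fund a channel with 20,000 sats from the on-chain wallet
    lncli openchannel --node_key 02abc...nodepubkey...def --local_amt 20000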

Services exist to help build that inbound liquidity, without which, you can’t accept payments from anyone else. Another story for a future post.

Anything On-Chain Is Slow and Expensive

For a technology that’s supposed to be reducing fees overall, Lightning costs you a bit up-front to get into, and any time you want to shuffle things around it costs sats. I initially bought in wishing to fund my own node and try for that oft-touted “self-sovereignty” of Bitcoin, but to achieve that you have to invest some money to get started. In the end, however, I hadn’t invested enough, because the channels I’d opened didn’t allow inbound payments.

I asked some people to open some channels to me and give me some inbound liquidity however not a single one of them successfully opened. My BitCoin and Lightning experiment had ground to a halt, once again.

At first I experimented with Tor, then with publishing on an external IP address, port-forwarding to expose the Lightning external access port 9735 to allow incoming connections. Research into why the opens failed highlighted that I needed to recreate my dockers connected to a custom Docker network and then resync the containers, otherwise the open-channel attempts would continue to fail.
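The custom-network part, at least, is straightforward; a sketch of the commands involved, with the container names being my assumption:

    # Create a user-defined bridge network and attach both containers to it
    docker network create lightning-net
    docker network connect lightning-net bitcoind
    docker network connect lightning-net lnd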

I did that and it still didn’t work.

Then I stumbled across the next idea: modifying the Synology DSM Docker implementation to allow the containers to attach directly to the network without being forced through a double-NAT. Doing so was likely to impact my other, otherwise perfectly happily running dockers.

UGH

That’s it.

I’m out.

Playing with Bitcoin today feels like programming COBOL for a bank in the 80s

Did you know that as of 2017, COBOL was behind nearly half of all financial transactions? Yes, and the world is gradually ripping it out (thankfully).

IDENTIFICATION DIVISION.
   PROGRAM-ID. CONDITIONALS.
   DATA DIVISION.
     WORKING-STORAGE SECTION.
     *> I'm not joking, lightning-cli and bitcoin-cli make me think I'm programming for a bank
     01 SATSJOHNHAS PIC 9(8) VALUE ZERO.
   PROCEDURE DIVISION.
     MOVE 20000 TO SATSJOHNHAS.
     IF SATSJOHNHAS > 0 THEN
       DISPLAY 'YAY I HAZ 20000 SATS!'
     END-IF
     *> I'd like to make all of my transactions using the command line, just like when I do normal banking...oh wait...
     *> ...and then the on-chain fees arrive:
     MOVE ZERO TO SATSJOHNHAS.
     EVALUATE TRUE
       WHEN SATSJOHNHAS = 0
         DISPLAY 'NO MORE SATS NOW :('
     END-EVALUATE.
   STOP RUN.

There is no doubt there’s a bit of geek elitism amongst many of the people involved with Bitcoin. Comments like “Don’t use a GUI; to understand it you MUST use the command line…” remind me of those who whined about the Macintosh having a GUI in 1984, because a “real” computer used DOS. OMFG, seriously?

A real financial system is as painless for the user as possible. Unbeknownst to me, I’d chosen perhaps the least advisable method: the wrong hardware, running the wrong software, running a less-compatible set of dockers. My conclusion was that setting up your own node that you control is not easy.

It’s not intuitive either, and it will make you think about things like inbound liquidity that you never thought you’d need to know about, since you’re a geek, not an investment banker. I suppose the point is that owning your own bank means you have to learn a bit about how a bank needs to work, and that takes time and effort.

If you’re happy to just pay someone else to build and operate a node for you then that’s fine; that’s just what you’re doing today with any bank account. I spent weeks learning just how much I don’t want to be my own bank, thank you very much - or at least, I didn’t want to be using the equipment I had laying about while living in the Terminal.

Synology as a Node Verdict

Docker was not reliable enough either. In some instances I would modify a single docker’s configuration file and restart the container, only to get “Docker API failed”. Sometimes I could recover by picking the container I thought had caused the failure (most likely the one I’d modified, but not always), clearing it and restarting it.

Other times I had to completely reboot the Synology to recover it, and sometimes I had to do both before Docker would restart. Every restart of the Bitcoin container cost another half an hour, after which the container would “go backwards” to 250 blocks behind, taking a further 24-48 hours of resynchronising with the blockchain before the Lightning container could resynchronise with it in turn. All the while, the node was offline.

If your Synology is running SSDs, has at least 8GB of RAM, is relatively new and you don’t mind hacking your DSM Docker installation, you could probably make it work, but it’s horses for courses in the end. If you have an old PC laying about then use that. If you have RAM and an SSD in your NAS then maybe build a VM rather than use Docker. Or better yet, get a Raspberry Pi and have a dedicated little PC that can do the work.

Don’t Do What I Did

Money is Gone

The truth is, in an attempt to get incoming channel opens working, I flicked between Bridge and Host networking and back again, opened different ports amid SOCKS-failure errors, and finally gave up when, after many hours, the LND docker just wouldn’t connect via ZMQ any more.

And with that my $100 AUD investment is now stuck between two virtual wallets.

I will keep trying and report back but at this point my intention is to invest in a Raspberry Pi to run my own Node. I’ll let you know how that goes in due course.

Podcasting 2.0 Addendum

I recently wrote about Podcasting 2.0 and thought I should add a further amendment regarding their goals. I previously wrote:

To solve the problems above there are a few key angles being tackled: Search, Discoverability and Monetisation.

I’d like to add a fourth key angle, which at the time I didn’t think should be listed as its own; however, having listened to Episodes 16 and 17 and their intention to add XML tags for IRC/chat room integration, I think I should add it: Interactivity.

Interactivity

The problem with broadcast historically is that audience participation is difficult given the tools and effort required. Pick up the phone, make a call: you need a big incentive (think cash prizes, competitions, discounts, something!) or audiences just don’t participate. It’s less personal, and with less of a personal connection, listeners’ desire to connect is much lower.

Podcasting, being an internet-first and far more personal medium, sets the bar differently. We can think of real-time feedback methods as verbal (via a dial-in/patch-through to the live show) or written (via messaging, like a chat room). There are also non-real-time methods, predominantly webforms and email. With contact emails already in the RSS XML specification, adding a webform submission entry might be of some use (perhaps < podcast:contactform > with a url="https://contact.me/form"), but real-time is far more interesting.

Real Time Interactivity

Initially in podcasting (like so many internet-first technology applications), geeks who understood how it worked led the way. That is to say, with podcasts originally there was a way for a percentage of listeners to use IRC as a chat room (Pragmatic did this for the better part of a year in 2014, as did other far more popular shows like ATP, Back To Work etc.) to get real-time listener interaction during a podcast recording, albeit with a slight delay between the audio going out and listener responses in the chat room.

YouTube introduced live streaming and live chat with playback, integrating the chat room with the video content to lower the barrier to entry on their platform. For equivalent podcast functionality to reach beyond the geek-% of podcast listeners, podcast clients will need to do the same, and for podcast clients to be pressured into supporting it, standardisation of the XML tags and backend infrastructure is a must.

The problem with interactivity is that whilst it starts with the tag, it must end with the client applications, otherwise only the geek-% of listeners will use it, as they do now.

From my own experiences with live chat rooms during my own and other podcasts, the people who are able to tune in to a live show and be present (lots of people just “sit” in a channel and aren’t actually present) amount to about 1-2% of your overall downloads, and that’s for a technical podcast with a high geek-%. I’ve also found there are timezone effects: podcasting live at different times of the day or night directly impacts those percentages even further (it’s 1am somewhere in the world right now, so if your listeners live in that timezone, chances are they won’t be listening live).

The final concern is that chat rooms only work for a certain kind of podcast. For me, it could only potentially work with Pragmatic, and in my experience I wanted Pragmatic to be focussed; chat rooms turned out to be a huge distraction. Over and again my listeners reiterated that one of the main attractions of podcasts is the ability to time-shift and listen when they want to listen. Being live was, to them, a minus, not a plus.

For these reasons I don’t see that this kind of interactivity will uplift the podcasting ecosystem for the vast majority of podcasters, though it’s certainly nice to have and attempt to standardise.

Crowd-sourced Chapters

Previously I wrote:

The interesting opportunity that Adam puts forward with chapters is he wants the audience to be able to participate with crowd-sourced chapters as a new vector of audience participation and interaction with podcast creators.

I looked at this last time from the practical standpoint of “how would I, as a podcaster, use this?”, concluding that I wouldn’t since I’m a self-confessed control-freak, but I didn’t fully appreciate the angle of audience interaction. For podcasts with a truly significant audience, with listeners who really want to help out (but can’t help financially), this feature provides a potential avenue to assist in a non-financial way, which is a great idea.

Crowd-source Everything?

(Except recording the show!)

From pre-production to post-production, any task in the podcast creation chain could be outsourced to an extent. The pre-production dilemma could be addressed with a feed-level XML tag, say < podcast:proposedtopics >, pointing to a planned topic list (popular podcasts currently use Twitter hashtags like #AskTheBobbyMcBobShow), cutting centralised platforms like Twitter out of the creation chain in the long term. Again, this is only useful for certain kinds of shows, but it could include a URL link to a shared document (probably a JSON file) and an episode index reference (i.e. if the currently released episode is 85, proposed topics are for Episode 86; it could also be an array covering multiple episodes), as sketched below.
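Purely as a hypothetical sketch (this tag doesn’t exist in the namespace; the attributes and URL are invented for illustration):

    <podcast:proposedtopics episode="86" url="https://example.com/topics/86.json" />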

The post-production dilemma generally consists of show notes, chapters (solution in progress) and audio editing. Perhaps a system similar to crowd-sourced chapters could be used for show notes, to include useful/relevant links for the current episode that aren’t/can’t easily be embedded as chapter markers.

In either case, there’s no reason why it couldn’t work the same way as crowd-sourced chapter markers. The podcaster would have the administrative access to add, modify or remove content from either of these, with guests also having read/write access. With an appropriate client tool this would eliminate the plethora of different methods in use today; shared Google documents, quite popular with many podcasters, will not be around indefinitely.

All In One App?

Of course, the more features we pile into the podcast client app, the more difficult it becomes to write and maintain. Previously, an excellent programmer-cum-podcaster-cum-audiophile like Marco Arment could create Overcast. With Lightning network integration, plus crowd-sourced chapters, shared document support (notes etc.) and a text chat client (IRC), the application quickly becomes much heavier and more complex, with fewer developers having the knowledge in every dimension needed to create an all-in-one client app.

The need for better frameworks to make feature integration easier for developers is obvious. There may well be a need for two classes of app, or at least two views (the listener view and the podcaster view), or simply multiple apps for different purposes. Either way, it’s interesting to see where the Tag + Use Case + Tool-chain can lead us.

Podcasting 2.0

I’ve been podcasting for close to a decade, and whilst I’m not what some might refer to as the “Old Guard”, I’ve come across someone who definitely qualifies as such: Adam Curry.

Interestingly, when I visited Houston in late 2019, pre-COVID19, my long-time podfriend Vic Hudson suggested I catch up with Adam, who lived nearby, referring to him as the “Podfather.” I had no idea who Adam was at that point and thought nothing of it at the time; although I caught up with Manton Reece at the IndieWeb Meetup in Austin, I ran out of time for much else. Since then a lot has happened: I’ve come across Podcasting 2.0, and thus began my somewhat belated self-education in the podcasting history that predates my involvement, of which I had clearly been ignorant until recently.

In the first episode of Podcasting 2.0, “Episode 1: We are upgrading podcasting” (29th of August, 2020), at about 17 minutes in, Adam recounts the story of when Apple and Steve Jobs wooed him over podcasting, as he handed over his own podcast index as it stood at the time to Apple as the new custodians. He refers to Steve Jobs’ appearance at D3, where at 17:45 Steve defined podcasting as iPod + Broadcasting = Podcasting, further describing it as “Wayne’s World for radio”, and even played a clip of Adam Curry complaining about the unreliability of his Mac.

The approximate turn of events thereafter: Adam hands over the podcast index to Apple; Apple builds podcasting into iTunes and their iPod line-up and becomes the largest podcast index; many other services launch, but indies and small networks dominate podcasting for the most part. For the longest time, though, Apple didn’t do much at all to extend podcasting. Apple added a few RSS feed namespace tags here and there but did not attempt to monetise podcasting, even as many others came into the space, bringing big names from conventional media, and with them many companies starting or attempting to convert podcast content into something that wasn’t as open as it had been, with “exclusive” pay-for content.

What Do I Mean About Open?

To be a podcast by its original definition, it must have an RSS feed that can be hosted on any machine serving pages to the internet and read by any other machine on the internet, with an audio tag referring to an audio file that can be streamed or downloaded by anyone. A subscription podcast requires login credentials of some kind, usually associated with a payment scheme, in order to listen to the audio of those episodes. Some people draw the line at free = open (and nothing else); others are happy with the occasional authenticated feed that’s still available on any platform/player, as that still presents an ‘open’ choice. Much beyond that, though (won’t play in any player; not everyone can find/get the audio), and things start becoming rather more closed.

Due to their open nature, tracking of podcast listeners, demographics and such is difficult. Whilst advertisers see this as a minus, most privacy-conscious listeners see it as a plus.

Back To The History Lesson

With big money and big names, a new kind of podcast emerged: one behind a paywall, with features and functionality that a traditional open podcast using the current namespace tags didn’t or couldn’t have. With platforms scaling and big money flowing in, average ad revenue was effectively driven down across the board, and more self-censorship and forced censorship was introduced over content that had previously been freely open.

With Spotify and Amazon gaining traction, more multi-million dollar deals and a lack of action from Apple, it’s become quite clear to me that podcasting as I’ve known it over the past decade is in a battle with more traditional, radio-type production companies, with money from their traditional radio, movie and music businesses behind them. The larger and more closed podcast ecosystems become, the harder it is for those not anointed as worthy by those companies to be heard amongst them.

Instead of spending time and energy on highly targeted advertising, carefully selecting shows (and podcasters) individually to attract their target demographic, advertisers start dealing only with the bigger companies in the space, since they want the demographics that come from user tracking. With those bigger companies claiming a large slice of the audience, they then over-sell their ad inventory, leading to lower-value DAI (Dynamic Ad Insertion) and less-personal advertising, further driving down ad revenues.

(Is this starting to sound like radio yet? I thought podcasting was supposed to get us away from that…)

Finally, another issue emerged: that of controversial content. What one person finds controversial, another finds acceptable. With many countries around the world, each with different laws regarding freedom of speech, and people of many different belief systems, a way to censor content in a fundamentally open ecosystem (albeit one with partly centralised search) was a lever that would inevitably be pulled at some point.

As such, many podcasts have been removed from different indexes/directories for different reasons, some perhaps more valid than others; however, that is a subjective measure and one I don’t wish to debate here. If podcasts are no longer open, then their corporate controller can remove them, in part or in whole, even more easily, as they control both the search and the feed.

To solve the problems above there are a few key angles being tackled: Search, Discoverability and Monetisation.

Search

Quick and easy: the Podcast Index is a complete list of every podcast currently available that’s been submitted. It isn’t censored, and it’s operated and maintained through the support of its users. As it’s independent, there is no hierarchy to pressure the removal of content from it.

Monetisation

The concept here is ingenious but requires a leap of faith (of a sort). Enter Bitcoin, or rather Lightning, a micro-transaction layer that sits atop Bitcoin. If you’re already au fait with having a Bitcoin node, Lightning node and wallet then there’s nothing for me to add, but the interesting concept is this: by publishing your node address in your podcast RSS feed (using the podcast:value tag), a compliant podcast player can optionally use the Lightning KeySend command to send a periodic payment “as you listen.” It’s voluntary, but it’s seamless.

The podcaster sets a suggested rate in sats (Satoshis) per minute of podcast played (per recorded minute, not played minute, if you’re listening at 2x; the rate is adjustable by the listener) to directly compensate the podcast creator for their work. You can also “Boost” and provide one-off payments via a similar mechanism to support your podcast creator.

The transactions are so small and carry such minimal transaction fees that effectively the entire amount is transferred from listener to podcaster, without any significant middle-person skimming off the top, in a manner that reflects value as time listened versus time created, and without relying on a single piece of centralised infrastructure.

Beyond this, the podcaster can define additional splits so that the sats a listener streams also go to their co-hosts, to the podcast player’s app developer and more. Imagine being able to directly and fairly compensate audio editors, artwork contributors and hosting providers, all based on listeners actually consuming the content in real time.
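As a rough sketch of how that might look in a feed (the node addresses and splits are placeholders; the namespace documentation has the authoritative format):

    <podcast:value type="lightning" method="keysend" suggested="0.00000015000">
      <podcast:valueRecipient name="Podcaster" type="node"
          address="02abc...nodepubkey...def" split="90" />
      <podcast:valueRecipient name="App Developer" type="node"
          address="03def...nodepubkey...abc" split="10" />
    </podcast:value>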

This allows a more balanced value distribution and protects against the failure modes of the current non-advertising podcast-funding model via a support platform like Patreon or Memberful (which is actually Patreon anyway). When Patreon goes out of business, all of those supportive audiences will be partly crippled as their creators scramble to migrate their users to an alternative. The question is: will it be another centralised platform or service, or a decentralised system like this?

That’s what’s so appealing about the Podcasting 2.0 proposition: it’s future-proof, balanced and sensible, and it avoids the centralisation problems that have stifled creativity in the music and radio industries in the past. There’s only one problem, and it’s a rather big one: the lack of adoption of Lightning and Bitcoin. At the time of publishing only Sphinx supports podcast KeySend, but adding more client applications to that list of one is an easier problem to solve than mass listener adoption of Bitcoin/Lightning.

Adam is betting that podcasting might be the gateway to mass adoption of Bitcoin and Lightning, and if that bet is to have a chance of self-realising, he will need the word spread far and wide to drive that outcome.

As of time of writing I have created a Causality Sphinx Tribe for those that wish to contribute by listening or via Boosting. It’s already had a good response and I’m grateful to those that are supporting Causality via that means or any other for that matter.

Discoverability

This is by far the biggest problem to solve, and if we don’t improve it dramatically, the only people and content that will be ‘findable’ will be the big names with big budgets/networks behind them, leaving the better creators without such backing left lacking. It should be just as easy to find an independent podcast with amazing content made by one person as it is to find a multi-million dollar podcast made by an entire production company. (And if the independent show has better content, then the sats should flow to them…)

Current efforts are focussed on the addition of better tags in the Podcasting NameSpace to allow automated and manual searches for relevant content, and to add levers to improve promotability of podcasts.

They are sensibly splitting the namespace into Phases, each Phase containing a small group of tags, agreeing several tags at a time with the primary focus of closing out one Phase before embarking on too much detail for the next. The first phase (now released) included the following:

  • < podcast:locked > (Technically not discoverability) If set to ‘yes’, the podcast is NOT permitted to be imported into a platform. This needs to be implemented by all platforms (or as many as possible) to be effective in preventing podcast theft, which is rampant on platforms like Anchor (aka Spotify)
  • < podcast:transcript > A link to an episode transcript file
  • < podcast:funding > (Technically not discoverability) Link to the approved funding page/method (in my case Patreon)
  • < podcast:chapters > A server-side JSON format for chapters that can be static or collaborative (more below)
  • < podcast:soundbite > Link to one or more excerpts from the episode, for a prospective listener to check out before downloading or streaming the whole episode from the beginning

I’ve implemented those that I see as having a benefit for me, which is all of them (soundbite is a WIP for Causality), with the exception of Chapters. The interesting opportunity that Adam puts forward with chapters is he wants the audience to be able to participate with crowd-sourced chapters as a new vector of audience participation and interaction with podcast creators. They’re working with HyperCatcher’s developer to get this working smoothly but for now at least I’ll watch from a safe distance. I think I’m just too much of a control freak to hand that out on Causality to others to make chapter suggestions. That said it could be a small time saver for me for Pragmatic…maybe.

The second phase (currently a work in progress) is tackling six more:

  • < podcast:person > List of people that are on an episode or the show as a whole, along with a canonical reference URL to identify them
  • < podcast:location > The location of the focus of the podcast’s or episode’s specific content (for TEN, this only makes sense for Causality)
  • < podcast:season > Extension of the iTunes season tag that allows a text string name in addition to the season number integer
  • < podcast:episode > Modification of the iTunes episode tag that allows non-integer values including decimal and alpha-numeric
  • < podcast:id > Platforms, directories, hosts, apps and services this podcast is listed on
  • < podcast:social > List of social media platform/accounts for the podcast/episode

Whilst there are many more in Phase 3, which is still open, the most interesting is the aforementioned < podcast:value >, where the podcaster can provide a Lightning node ID for payment using the KeySend protocol.

TEN Makes It Easy

This is my “that’s fine for John” moment, where I point out that incorporating these into the fabric of The Engineered Network website hasn’t taken too much effort. TEN runs on GoHugo as a static site generator, and whilst it was based on a very old fork of Castanet, I’ve re-written and extended so much of that now that it’s not recognisable.

I already had person name tagging, person name files, funding links, subscribe links for other platforms, social media tags and transcripts (for some episodes) in the Markdown YAML front-matter and templates, so adding them to the RSS XML template was extremely quick and easy and required very little additional work.

The most intensive tags are those that require additional metadata to make them work. Location only makes sense to implement on Causality, but it took me about four hours of OpenStreetMap searching to compile about 40 episodes’ worth of location information. The other is soundbite (WIP), where retrospectively searching for one or more choice quotes is time-consuming.

Not everyone out there is a developer (part- or full-time), and many hence rely on hosting services to support these tags. There’s a relatively well-maintained list at the Podcast Index, and at time of writing Castopod, BuzzSprout, Fireside, Podserve and Transistor support one or more tags, with Fireside (thank you Dan!) supporting an impressive six of them: Transcript, Locked, Funding, Chapters, Soundbite and Person.

Moving Forward

I’ve occasionally chatted with the lovely Dave Jones (Adam’s co-host and the developer working on many aspects of 2.0) on the Fediverse, and I listen to 2.0 via Sphinx when I can (unfortunately not on my mobile/iPad, as the app has been banned by my company’s remote device management policy). I’ve implemented the majority of their proposed tags thus far on my shows, and I’m also in the process of setting up my own private Bitcoin/Lightning node.

For the entire time I’ve been involved in the podcasting space, I’ve never seen a concerted effort like this take place. It’s both heartening and exciting, and it feels a bit like the early days of Twitter (before Jack Dorsey took it public, bought some of the apps, effectively killed the rest, and pushed the algorithmic timeline, ruining Twitter to an extent). It’s a coalition of concerned creators, collaborating to create a better outcome for future podcast creators.

They’ve seen where podcasting has come from and where it’s going, and if we get involved we can help deliver our own destiny, rather than leaving it to be dictated by corporations with questionable agendas.