TechDistortion Articles https://techdistortion.com/articles en john@techdistortion.com Copyright 2009-2021 2021-01-20T02:32:10+00:00 TechDistortion Articles Wed, 20 Jan 2021 02:32:10 GMT Podcasting 2.0 Addendum https://techdistortion.com/articles/podcasting-2-0-addendum https://techdistortion.com/articles/podcasting-2-0-addendum Podcasting 2.0 Addendum I recently wrote about Podcasting 2.0 and thought I should add a further amendment regarding their goals. I previously wrote:

To solve the problems above there are a few key angles being tackled: Search, Discoverability and Monetisation.

I’d like to add a fourth key angle to that. I didn’t think at the time it should be listed as its own; however, having listened more to Episodes 16 and 17 and their intention to add XML tags for IRC/Chat Room integration, I think I should add the fourth key angle: Interactivity.

Interactivity

The problem with broadcast historically is that audience participation is difficult given the tools and effort required. Pick up the phone and make a call? You need a big incentive (think cash prizes, competitions, discounts, something!) or audiences just don’t participate. Broadcast is less personal, and with less of a personal connection the desire for listeners to connect is much weaker.

In podcasting, as an internet-first application that’s far more personal, the bar is set differently, and we can think of real-time feedback methods as verbal, via a dial-in/patch-through to the live show, or written, via messaging like a chat room. There are also non-real-time methods, predominantly via webforms and EMail. With contact EMails already in the RSS XML specification, adding a webform submission entry might be of some use (perhaps < podcast:contactform > with a url="https://contact.me/form"), but real-time is far more interesting.
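As a sketch only: no such tag exists in the namespace, and the tag name and attribute here are purely my suggestion, but a channel-level entry might look something like this:

```xml
<!-- Hypothetical: not part of any agreed Podcast Namespace phase -->
<podcast:contactform url="https://contact.me/form" />
```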

Real Time Interactivity

In podcasting initially (like so many internet-first technology applications), the geeks that understood how it worked led the way. That is to say, with podcasts originally there was a way for a percentage of the listeners to use IRC as a Chat Room to get real-time listener interaction during a podcast recording, albeit with a slight delay between audio out and listener response in the chat room. (Pragmatic did this for the better part of a year in 2014, as did other far more popular shows like ATP, Back To Work etc.)

YouTube introduced live streaming and live chat with playback that integrated the chat room with the video content to lower the barrier of entry for their platform. For equivalent podcast functionality to go beyond the geek-% of the podcast listeners, podcast clients will need to do the same. In order for podcast clients to be pressured to support it, standardisation of the XML tags and backend infrastructure is a must.

The problem with interactivity is that whilst it starts with the tag, it must end with the client applications otherwise only the geek-% of listeners will use it as they do now.

From my own experiences with live chat rooms during my own and other podcasts, the number of people able to tune in to a live show and actually be present (lots of people just “sit” in a channel and aren’t actually present) is about 1-2% of your overall downloads, and that’s for a technical podcast with a high geek-%. I’ve also found there are timezone effects: the time of day or night you podcast live directly impacts those percentages even further (it’s 1am somewhere in the world right now, so if your listeners live in that timezone, chances are they won’t be listening live).

The final concern is that chat rooms only work for a certain kind of podcast. For me, it could only potentially work with Pragmatic and in my experience I wanted Pragmatic to be focussed and chat rooms turned out to be a huge distraction. Over and again my listeners reiterated that one of the main attractions of podcasts was their ability to time-shift and listen to them when they wanted to listen to them. Being live to them was a minus not a plus.

For these reasons I don’t see that this kind of interactivity will uplift the podcasting ecosystem for the vast majority of podcasters, though it’s certainly nice to have and attempt to standardise.

Crowd-sourced Chapters

Previously I wrote:

The interesting opportunity that Adam puts forward with chapters is he wants the audience to be able to participate with crowd-sourced chapters as a new vector of audience participation and interaction with podcast creators.

Last time I looked at this from the practical standpoint of “how would I as a podcaster use this?”, concluding that I wouldn’t use it since I’m a self-confessed control-freak, but I didn’t fully appreciate the angle of audience interaction. I think for podcasts that have a truly significant audience, with listeners that really want to help out but can’t help financially, this feature provides a potential avenue to assist in a non-financial way, which is a great idea.

Crowd-source Everything?

(Except recording the show!)

From pre-production to post-production, any task in the podcast creation chain could be outsourced to an extent. The pre-production dilemma could be addressed by a feed-level XML tag, < podcast:proposedtopics >, pointing to a planned topic list (popular podcasts currently use Twitter #Tags like #AskTheBobbyMcBobShow), cutting centralised platforms like Twitter out of the creation chain in the long term. Again, this is only useful for certain kinds of shows, but it could include a URL link to a shared document (probably a JSON file) and an episode index reference (i.e. if the currently released episode is 85, proposed topics are for Episode 86; it could also be an array covering multiple episodes).
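A sketch of what that might look like in the feed; the tag name, attributes and JSON file location are all hypothetical:

```xml
<!-- Hypothetical pre-production tag: name and attributes are illustrative only -->
<podcast:proposedtopics url="https://example.com/proposed-topics.json" episode="86" />
```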

The post-production dilemma generally consists of show notes, chapters (solution in progress) and audio editing. Perhaps a similar system to crowd-sourced chapters could be used for show notes that could include useful/relevant links for the current episode that aren’t/can’t be easily embedded as Chapter markers.

In either case there’s no reason why it couldn’t work the same way as crowd-sourced chapter markers. The podcaster could have administrative access to add/modify/remove content from either of these, with guests also having read/write access. With an appropriate client tool this would then eliminate the plethora of different methods in use today: shared Google documents, for example, are quite popular with many podcasters, but they will not be around indefinitely.

All In One App?

Of course, the more features we pile into the podcasting client app, the more difficult it becomes to write and maintain. Previously an excellent programmer, podcaster and audiophile like Marco Arment could create Overcast. With Lightning network integration, plus crowd-sourced chapters, shared document support (notes etc.) and a text chat client (IRC), the application quickly becomes much heavier and more complex, with fewer developers having the knowledge in every dimension to create an all-in-one client app.

The need for better frameworks to make feature integration easier for developers is obvious. There may well be a need for two classes of app, or at least two views: the listener view and the podcaster view, or simply multiple apps for different purposes. Either way it’s interesting to see where the Tag + Use Case + Tool-chain can lead us.

]]>
Podcasting 2021-01-01T12:15:00+10:00 #TechDistortion
Podcasting 2.0 https://techdistortion.com/articles/podcasting-2-0 https://techdistortion.com/articles/podcasting-2-0 Podcasting 2.0 I’ve been podcasting for close to a decade and whilst I’m not what some might refer to as the “Old Guard” I’ve come across someone that definitely qualifies as such: Adam Curry.

Interestingly, when I visited Houston in late 2019, pre-COVID19, my long-time podfriend Vic Hudson suggested I catch up with Adam, as he lived nearby, and referred to him as the “Podfather.” I had no idea who Adam was at that point and thought nothing of it at the time, and although I caught up with Manton Reece at the IndieWeb Meetup in Austin I ran out of time for much else. Since then a lot has happened and I’ve come across Podcasting 2.0, and thus began my somewhat belated self-education in the podcasting history that predates my own involvement, of which I had clearly been ignorant until recently.

In the first episode of Podcasting 2.0, “Episode 1: We are upgrading podcasting” on the 29th of August, 2020, at about 17 minutes in, Adam recounts the story of when Apple and Steve Jobs wooed him with regards to podcasting, as he handed over his own Podcast Index, as it stood at the time, to Apple as the new custodians. He refers to Steve Jobs' appearance at D3 where, at 17:45, Steve defined podcasting as iPod + Broadcasting = Podcasting, further describing it as “Wayne’s World for Podcasting”, and even plays a clip of Adam Curry complaining about the unreliability of his Mac.

The approximate turn of events thereafter: Adam hands over the podcast index to Apple; Apple builds podcasting into iTunes and its iPod line-up and becomes the largest podcast index; many other services launch, but indies and small networks dominate podcasting for the most part. For the longest time Apple didn’t do much at all to extend podcasting: it added a few RSS feed namespace tags here and there but did not attempt to monetise podcasting, even as many others came into the podcasting space, bringing big names from conventional media and with them many companies starting, or attempting, to convert podcast content into something that wasn’t as open as it had been, with “exclusive” pay-for content.

What Do I Mean About Open?

To be a podcast by its original definition it must have an RSS feed that can be hosted on any machine serving pages to the internet, readable by any other machine on the internet, with an audio tag referring to an audio file that can be streamed or downloaded by anyone. A subscription podcast requires login credentials of some kind, usually associated with a payment scheme, in order to listen to the audio of those episodes. Some people draw the line at free = open (and nothing else); others are happy with the occasional authenticated feed that’s still available on any platform/player, as that still presents an ‘open’ choice. Much beyond that, though (won’t play in any player, not everyone can find/get the audio), and things start becoming a bit more closed.

Due to their open nature, tracking of podcast listeners, demographics and such is difficult. Whilst advertisers see this as a minus, most privacy conscious listeners see this as a plus.

Back To The History Lesson

With big money and big names a new kind of podcast emerged, one behind a paywall with features and functionality that other podcast platforms didn’t or couldn’t have with a traditional open podcast using current namespace tags. With platforms scaling and big money flowing into podcasting, it effectively brought down the average ad-revenue across the board in podcasting and introduced more self-censorship and forced-censorship of content that previously was freely open.

With Spotify and Amazon gaining traction, more multi-million dollar deals and a lack of action from Apple, it’s become quite clear to me that podcasting as I’ve known it in the past decade is in a battle with more traditional, radio-type production companies with money from their traditional radio, movie and music businesses behind them. The larger and more closed podcast eco-systems become, the harder it becomes for those that aren’t anointed by those companies as being worthy to be heard amongst them.

Instead of spending time and energy on highly targeted advertising, carefully selecting shows (and podcasters) individually to attract their target demographic, advertisers start dealing only with the bigger companies in the space, since they want demographics from user tracking. With the bigger companies claiming a large slice of the audience, they then over-sell their ad inventory, leading to lower-value DAI and less-personal advertising, further driving down ad revenues.

(Is this starting to sound like radio yet? I thought podcasting was supposed to get us away from that…)

Finally another issue emerged: that of controversial content. What one person finds controversial another person finds acceptable. With many countries around the world, each with different laws regarding freedom of speech and with people of many different belief systems, having a way to censor content with a fundamentally open ecosystem (albeit with partly centralised search) was a lever that would inevitably be pulled at some point.

As such many podcasts have been removed from different indexes/directories for different reasons, some more valid than others perhaps, however that is a subjective measure and one I don’t wish to debate here. If podcasts are no longer open then their corporate controller can even more easily remove them in part or in whole as they control both the search and the feed.

To solve the problems above there are a few key angles being tackled: Search, Discoverability and Monetisation.

Search

Quick and easy: the Podcast Index is a complete list of every podcast currently available that’s been submitted. It isn’t censored and is operated and maintained by the support of its users. As it’s independent, there is no hierarchy to pressure the removal of content from it.

Monetisation

The concept here is ingenious but requires a leap of faith (of a sort): Bitcoin, or rather Lightning, which is a micro-transaction layer that sits alongside Bitcoin. If you’re already au fait with running a Bitcoin Node, Lightning Node and Wallet then there’s nothing for me to add, but the interesting concept is this: by publishing your Node address in your podcast RSS feed (using the podcast:value tag), a compliant podcast player can then optionally use the Lightning KeySend command to send a periodic payment “as you listen.” It’s voluntary but it’s seamless.

The podcaster sets a suggested rate in Sats (Satoshis) per minute of podcast played (per recorded minute, not played minute, if you’re listening at 2x; the rate is adjustable by the listener) to directly compensate the podcast creator for their work. You can also “Boost” and provide one-off payments via a similar mechanism to support your podcast creator.
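As a back-of-the-envelope sketch of the payment arithmetic only (the numbers are illustrative, and in practice a compliant app streams small payments periodically via KeySend rather than in one lump):

```python
def session_sats(rate_per_min: int, recorded_minutes: float, listener_multiplier: float = 1.0) -> int:
    """Sats for a listening session.

    The rate applies per *recorded* minute consumed, so listening at 2x
    speed still pays for every recorded minute heard; the listener can
    scale the podcaster's suggested rate up or down via the multiplier.
    """
    return round(rate_per_min * recorded_minutes * listener_multiplier)

# A 60-minute episode at a suggested 10 sats/minute:
print(session_sats(10, 60))       # 600 sats, regardless of playback speed
print(session_sats(10, 60, 2.0))  # 1200 sats if the listener doubles the rate
```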

The transactions are so small and carry such minimal transaction fees that effectively the entire amount is transferred from listener to podcaster without any significant middle-person skimming off the top in a manner that both reflects the value in time listened vs created and without relying on a single piece of centralised infrastructure.

Beyond this, the podcaster can choose additional splits, so the Sats a listener streams also go to their co-hosts, to the podcast player app-developer and more. Imagine being able to compensate audio editors, artwork contributors and hosting providers, all directly and fairly, based on listeners actually consuming the content in real time.
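In feed terms the splits look something like this; the node addresses and split percentages below are placeholders, and the exact attribute set was still being finalised in Phase 3 at the time of writing:

```xml
<podcast:value type="lightning" method="keysend" suggested="0.00000005000">
  <!-- split values are relative shares of each streamed payment -->
  <podcast:valueRecipient name="Host" type="node" address="HOST_NODE_PUBKEY" split="80" />
  <podcast:valueRecipient name="Co-host" type="node" address="COHOST_NODE_PUBKEY" split="15" />
  <podcast:valueRecipient name="App Developer" type="node" address="APP_NODE_PUBKEY" split="5" />
</podcast:value>
```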

This allows a more balanced value distribution and protects against the failure of the current non-advertising podcast-funding model via a support platform like Patreon (oh, I mean Memberful, but that’s actually Patreon). When Patreon goes out of business all of those supportive audiences will be partly crippled as their creators scramble to migrate their users to an alternative. The question is: will it be another centralised platform or service, or a decentralised system like this?

That’s what’s so appealing about the Podcasting 2.0 proposition: it’s future-proof, balanced and sensible, and it avoids the centralisation problems that have stifled creativity in the music and radio industries in the past. There’s only one problem and it’s a rather big one: the lack of adoption of Lightning and Bitcoin. At the time of publishing only Sphinx supports podcast KeySend, though adding more client applications to that list of one is an easier problem to solve than listener mass adoption of Bitcoin/Lightning.

Adam is betting that podcasting might be the gateway to mass adoption of Bitcoin and Lightning, and if he’s going to have a chance of realising that bet, he will need the word spread far and wide to drive that outcome.

As of time of writing I have created a Causality Sphinx Tribe for those that wish to contribute by listening or via Boosting. It’s already had a good response and I’m grateful to those that are supporting Causality via that means or any other for that matter.

Discoverability

This is by far the biggest problem to solve, and if we don’t improve it dramatically, the only people and content that will be ‘findable’ will be those of the big names with big budgets/networks behind them, leaving the better creators without such backing lacking exposure. It should be just as easy to find an independent podcast with amazing content made by one person as it is to find a multi-million dollar podcast made by an entire production company. (And if the independent show has better content, then the Sats should flow to them…)

Current efforts are focussed on the addition of better tags in the Podcasting NameSpace to allow automated and manual searches for relevant content, and to add levers to improve promotability of podcasts.

They are sensibly splitting the namespace into Phases, each Phase containing a small group of tags, progressively agreeing several tags at a time, with the primary focus of closing out one Phase of tags before embarking on too much detail for the next. The first phase (now released) included the following:

  • < podcast:locked > (Technically not discoverability) If ‘yes’, the podcast is NOT permitted to be imported into another platform. This needs to be implemented by all platforms (or as many as possible) to be effective in preventing podcast theft, which is rampant on platforms like Anchor (aka Spotify)
  • < podcast:transcript > A link to an episode transcript file
  • < podcast:funding > (Technically not discoverability) Link to the approved funding page/method (in my case Patreon)
  • < podcast:chapters > A server-side JSON format for chapters that can be static or collaborative (more below)
  • < podcast:soundbite > Link to one or more excerpts from the episode for a prospective listener to check out the episode before downloading or streaming the whole episode from the beginning
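Pulled together, a feed using the Phase 1 tags might look roughly like this; the URLs and values are placeholders, so check the namespace documentation for the exact attributes:

```xml
<channel>
  <podcast:locked owner="john@techdistortion.com">yes</podcast:locked>
  <podcast:funding url="https://example.com/support">Support the show</podcast:funding>
  <item>
    <podcast:transcript url="https://example.com/episode85/transcript.srt" type="application/srt" />
    <podcast:chapters url="https://example.com/episode85/chapters.json" type="application/json+chapters" />
    <podcast:soundbite startTime="73.0" duration="60.0">A choice quote from the episode</podcast:soundbite>
  </item>
</channel>
```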

I’ve implemented those that I see as having a benefit for me, which is all of them (soundbite is a WIP for Causality), with the exception of Chapters. The interesting opportunity that Adam puts forward with chapters is he wants the audience to be able to participate with crowd-sourced chapters as a new vector of audience participation and interaction with podcast creators. They’re working with HyperCatcher’s developer to get this working smoothly but for now at least I’ll watch from a safe distance. I think I’m just too much of a control freak to hand that out on Causality to others to make chapter suggestions. That said it could be a small time saver for me for Pragmatic…maybe.

The second phase (currently a work in progress) is tackling six more:

  • < podcast:person > List of people that are on an episode or the show as a whole, along with a canonical reference URL to identify them
  • < podcast:location > The location of the focus of the podcast or episodes specific content (for TEN, this only makes sense for Causality)
  • < podcast:season > Extension of the iTunes season tag that allows a text string name in addition to the season number integer
  • < podcast:episode > Modification of the iTunes episode tag that allows non-integer values including decimal and alpha-numeric
  • < podcast:id > Platforms, directories, hosts, apps and services this podcast is listed on
  • < podcast:social > List of social media platform/accounts for the podcast/episode
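A sketch of some of the Phase 2 tags at the item level; the values are illustrative only, and these tags were still in draft at the time of writing:

```xml
<podcast:person role="host" href="https://example.com/people/john">John Chidgey</podcast:person>
<podcast:location geo="geo:-27.47,153.03" osm="R123456">Brisbane, Australia</podcast:location>
<podcast:season name="Season Two">2</podcast:season>
<podcast:episode display="Ep.85b">85.5</podcast:episode>
```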

Whilst there are many more in Phase 3 which is still open, the most interesting is the aforementioned < podcast:value > where the podcaster can provide a Lightning Node ID for payment using the KeySend protocol.

TEN Makes It Easy

This is my “that’s fine for John” moment, where I point out that incorporating these into the fabric of The Engineered Network website hasn’t taken too much effort. TEN runs on GoHugo as a static site generator, and whilst it was based on a very old fork of Castanet, I’ve re-written and extended so much of that now that it’s not recognisable.

I already had people name tagging, people name files, funding, subscribe-to links on other platforms, social media tags and transcripts (for some episodes) in the MarkDown YAML front-matter and templates, so adding them into the RSS XML template was extremely quick and easy and required very little additional work.
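By way of illustration, the mapping in a GoHugo RSS template can be as simple as this; the front-matter keys here (transcript, patreon, guests) are hypothetical and would need to match your own theme:

```go-html-template
{{/* Hypothetical front-matter keys shown; adjust to your own front-matter */}}
{{ with .Params.transcript }}
  <podcast:transcript url="{{ . | absURL }}" type="text/html" />
{{ end }}
{{ with .Params.patreon }}
  <podcast:funding url="{{ . }}">Support the show</podcast:funding>
{{ end }}
{{ range .Params.guests }}
  <podcast:person role="guest">{{ . }}</podcast:person>
{{ end }}
```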

The most intensive tags are those that require additional meta-data to make them work. Location only makes sense to implement on Causality, but it took me about four hours of Open Street Map searching to compile about 40 episodes' worth of location information. The other one is soundbite (WIP), where searching for one or more choice quotes retrospectively is time-consuming.

Not everyone out there is a developer (part or full-time), and many hence rely on services to support these tags. There’s a relatively well-maintained list at Podcast Index and at the time of writing Castopod, BuzzSprout, Fireside, Podserve and Transistor support one or more tags, with Fireside (thank you Dan!) supporting an impressive six of them: Transcript, Locked, Funding, Chapters, Soundbite and Person.

Moving Forward

I’ve occasionally chatted with the lovely Dave Jones on the Fediverse (Adam’s co-host and the developer working on many aspects of 2.0) and listen to 2.0 via Sphinx when I can (unfortunately I can’t on my mobile/iPad as the app has been banned by my company’s remote device management policy) and I’ve implemented the majority of their proposed tags thus far on my shows. I’m also in the process of setting up my own private BitCoin/Lightning Node.

For the entire time I’ve been involved in the podcasting space, I’ve never seen a concerted effort like this take place. It’s both heartening and exciting and feels a bit like the early days of Twitter (before Jack Dorsey went public, bought some of the apps and effectively killed the rest and pushed the algorithmic timeline thus ruining Twitter to an extent). It’s a coalition of concerned creators, collaborating to create a better outcome for future podcast creators.

They’ve seen where podcasting has come from, where it’s going and if we get involved we can help deliver our own destiny and not leave it in the hands of corporations with questionable agendas to dictate.

]]>
Podcasting 2020-12-29T15:25:00+10:00 #TechDistortion
Oh My NAS https://techdistortion.com/articles/oh-my-nas https://techdistortion.com/articles/oh-my-nas Oh My NAS I’ve been on the receiving end of failing hard drives in the past and lost many of my original podcast source audio files and, more importantly, a year’s worth of home videos, gone forever.

Not wishing for a repeat of this, I purchased an 8TB external USB hard drive and installed BackBlaze. The problem for me though was that BackBlaze was an ongoing expense, could only be used for a single machine and couldn’t really do anything other than be an offsite backup. I’d been considering a Network Attached Storage device for years now and the thinking was: if I had a NAS then I could have backup redundancy1 plus a bunch of other really useful features and functionality.

The trigger was actually a series of crashes and disconnects of the 8TB USB HDD. Given the OS’s limited ability to troubleshoot HDD hardware-specific issues via USB, and my experience from a previous set of HDD failures many years ago, I knew this is how it all starts. So I gathered together a bunch of smaller HDDs and copied all the data across to them while I still could, and resolved to get a better solution: hence the NAS.

Looking at both QNAP and Synology and my desire to have as broad a compatibility as possible, I settled on an Intel-based Synology, which in Synology-speak, means a “Plus” model. Specifically the DS918+ presented the best value for money with 4 Bays and the ability to extend with a 5 Bay external enclosure if I really felt the need in future. I waited until the DS920+ was released and noted that the benchmarks on the 920 weren’t particularly impressive and hence I stuck with the DS918+ and got a great deal as it had just become a clearance product to make way for the new model.

The external drives I had been using to hold an interim copy of my data were: a 4TB 3.5", a 4TB 2.5" (at the time I thought it was a drive in an enclosure you could extract), and a 2TB 3.5" drive, as well as, of course, my 8TB drive which I wasn’t sure was toast yet. The goal was to reuse as many of my existing drives as possible and not spend even more money on new HDDs. I’d also given a disused but otherwise healthy 3.5" 4TB drive to my son for his PC earlier in the year and he hadn’t actually used it, so I reclaimed it temporarily for this exercise.

Here’s how it went down:

STEP 1: Insert 8TB Drive and in Storage Manager, Drive Info, run an Extended SMART test…and hours later…hundreds of bad sectors. To be honest, that wasn’t too surprising since the 8TB drive was periodically disconnecting and reconnecting and rebuilding its file tables - but now I had the proof. The Synology refused to let me create a Storage Pool or a Volume or anything so I resigned myself to buying 1 new drive: I saw that SeaGate Barracudas were on sale so I grabbed one from UMart and tried it.

STEP 2: Insert new 4TB Barracuda and in Storage Manager, Drive Info, run an Extended SMART test…and hours later…it worked perfectly! (As you’d expect) Though the test took a VERY long time, I was happy so I created a Storage Pool, Synology Hybrid RAID. Created a Volume, BTRFS because it came highly recommended, and then began copying over the first 4TB’s worth of data to the new Volume. So far, so good.

STEP 3: Insert my son’s 4TB drive and extend the SHR Storage Pool to include it. The Synology allowed me to do this, and I did so for some reason without running an Extended SMART test on it first; it let me, so that should be fine, right? Turns out, this was a terrible idea.

STEP 4: Once all data was copied off the 4TB data drive and to the Synology Volume, wipe that drive, extract the 3.5" HDD and insert the reclaimed 4TB 3.5" into the Synology and in Storage Manager, Drive Info, run an Extended SMART test…and hours later…hundreds of bad sectors. Um, okay. That’s annoying. So I might be up for yet another HDD since I have 9TB to store.

OH DEAR MOMENT: As I was re-running the drive check, the Synology began reporting that the Volume was Bad and the Storage Pool was unhealthy. I looked into the HDD manager and saw that my son’s reclaimed 3.5" drive was also full of bad sectors, as the Synology had run a periodic test while data was still copying. I also attempted to extract the 2.5" drive from the external enclosure, only to discover that it was a fully integrated controller/drive/enclosure and couldn’t be extracted without breaking it. (So much for that.) Since I still had a copy of my 4TB of data in BackBlaze, at this point I wasn’t worried about losing data, but the penny dropped: stop trying to save money and just buy the right drives. So I went to Computer Alliance and bought three shiny new 4TB SeaGate IronWolf drives.

STEP 5: Insert all three new 4TB IronWolfs and in Storage Manager, Drive Info, run an Extended SMART test…and hours later…the first drive: perfect! The second and third drives however…had bad sectors. Bad sectors. On new drives? And not only that, on NAS-specific, high-reliability drives? John = not impressed. I extended the Storage Pool (Barracuda + 1 IronWolf) and after running a Data Scrub it still threw up errors, despite the fact both drives appeared to be fine and were brand new.

IronWolf Fail This is not what you want to see on a brand new drive…

TROUBLESHOOTING:

So I did what all good geeks do: got out of the DSM GUI and hit SSH and the Terminal. I ran “btrfs check --repair”, then the btrfs rescue super-recover and chunk-recover options, and ultimately the chunk tree recovery failed. I read that I had to stop everything running and accessing the Pool, so I painstakingly killed every process and re-ran the recovery, but ultimately it still failed after a 24-hour-long attempt. There was nothing for it - it was time to start copying the data that was on there (what I could read) back on to a 4TB external drive, blow it all away and start over.

Chunk Fail

STEP 6: In the midst of a delusion that I could still recover the data without having to re-copy the lot of it off the NAS (a two-day exercise), I submitted a return request for the first failed IronWolf while I re-ran the SMART test on the other potentially broken drive. The return policy stated that they needed to test the HDD and that could take a day or two, and Computer Alliance is a two-hour round trip from my house. Fortunately I met a wonderfully helpful and accommodating support person at CA that day who, after taking the Synology screenshot of the bad sector count and serial number confirming I wasn’t pulling a switch on him, simply handed me a replacement IronWolf on the spot! (Such a great guy - give him a raise.) I returned home, this time treating the HDD like a delicate egg the whole trip, inserted it and in Storage Manager, Drive Info, ran an Extended SMART test…and hours later…perfect!

STEP 7: By this time I’d given up all hope of recovering the data and with three shiny new drives in the NAS, my 4TB of original data restored to my external drive (I had to pluck 5 files that failed to copy back from my BackBlaze backup) I wiped all the NAS drives…and started over. Not taking ANY chances I re-ran the SMART tests on all three and when they were clean (again) recreated the Pool, new Volume, and started copying my precious data back on to the NAS all over again.

STEP 8: I went back to Computer Alliance to return the second drive and this time I met a different support person, someone far more “by the book”, who accepted the drive and asked me to come back another day once they’d tested it. I returned home and hours later they called and said “yeah it’s got bad sectors…” (you don’t say?), and unfortunately due to personal commitments I couldn’t return until the following day. I grabbed the replacement drive, drove on eggshells, added it to the last free bay and in Storage Manager, Drive Info, ran an Extended SMART test…and hours later…perfect! (FINALLY)

STEP 9: I copied all of the data across from all of my external drives on to the Synology. The Volume was an SHR with 10.9TB of usable space spread across x4 4TB drives (x3 IronWolf and x1 Barracuda). The Data Scrub passed, the SMART tests passed, and the IronWolf-specific Health Management tests all passed with flying colours (all green, oh yes!). It was time to repurpose the 4TB 2.5" external drive as my offline backup for the fireproof safe. I reformatted it to ExFAT, set up HyperBackup for my critical files (home videos, videos of the family, my entire photo library), backed them up and put that in the safe.

CONCLUSION:

Looking back, the mistake was that I never should have extended the storage pool before the Synology had run a SMART test and flagged the bad sectors. In so doing it wrote data to those bad sectors, and there were just too many for BTRFS to recover in the end. In addition, I never should have tried to do this on the cheap: I should have bought new drives from the get-go, and NAS-specific drives at that. Despite the bad sectors and the bad luck of getting two out of three bad IronWolf drives, in the end they have performed very well and completed their SMART tests faster, with online forums suggesting a desktop-class HDD (the Barracuda) is a bad choice for a NAS. I now have my own test case to see if the Barracuda is actually suitable as a long-term NAS drive, since I ended up with both types in the same NAS, same age, same everything else, so I’ll report back in a few years to see which starts failing first.

Ultimately I also stopped using BackBlaze. It was slowing down my MacBook Pro, the video compression applied during data recovery was frustrating, and even with my 512GB SSD on the MBP holding everything, I would often get errors about a lack of space for backups to BackBlaze. Whilst financially the total lifecycle cost of the NAS and the drives is far more than BackBlaze (or an equivalent backup service) would cost me, the NAS can also do so many more things than just back up my data via TimeMachine.

But that’s another story for another article. In the end the NAS plus drives cost me $1.5k AUD, 6 trips to two different computer stores and 6 weeks from start to finish, but it’s been running now since August 2020 and hasn’t skipped a beat. Oh…my…NAS.


  1. Redundancy against the failure of an individual HDD ↩︎

]]>
Technology 2020-11-29T09:00:00+10:00 #TechDistortion
200-500mm Zoom Lens Test https://techdistortion.com/articles/200-500-zoom-lens-test https://techdistortion.com/articles/200-500-zoom-lens-test 200-500mm Zoom Lens Test I’ve been exploring my new 200-500mm Nikon f/5.6 Zoom Lens on my D500 and pushing the limits of what it can do. I’ve used it for several weeks taking photos of Soccer and Cricket and I thought I should run a few of my own lens sharpness tests to see how it’s performing in a controlled environment.

As in my previous Lens Shootout I tested sharpness indoors under controlled lighting, with the D500 on a tripod, fired with a timer, at a constant shutter speed of 1/200th of a second with Auto ISO, and I tweaked the exposure in post to try and equalise the light level between shots.

I set the back of some packaging, with a mixture of text and symbols, as the target, with the tripod at the same physical distance for each test photo.

Nikon 200-500mm Zoom Lens

I took photos across the aperture range at f/5.6, f/8 and f/11, cropped to 1,000 x 1,000 pixels in both the dead-center of the frame and the bottom-right edge of the frame.


200mm

200mm Center f/5.6 200mm Center Crop f/5.6

200mm Center f/8 200mm Center Crop f/8

200mm Center f/11 200mm Center Crop f/11

200mm Edge f/5.6 200mm Edge Crop f/5.6

200mm Edge f/8 200mm Edge Crop f/8

200mm Edge f/11 200mm Edge Crop f/11


300mm

300mm Center f/5.6 300mm Center Crop f/5.6

300mm Center f/8 300mm Center Crop f/8

300mm Center f/11 300mm Center Crop f/11

300mm Edge f/5.6 300mm Edge Crop f/5.6

300mm Edge f/8 300mm Edge Crop f/8

300mm Edge f/11 300mm Edge Crop f/11


400mm

400mm Center f/5.6 400mm Center Crop f/5.6

400mm Center f/8 400mm Center Crop f/8

400mm Center f/11 400mm Center Crop f/11

400mm Edge f/5.6 400mm Edge Crop f/5.6

400mm Edge f/8 400mm Edge Crop f/8

400mm Edge f/11 400mm Edge Crop f/11


500mm

500mm Center f/5.6 500mm Center Crop f/5.6

500mm Center f/8 500mm Center Crop f/8

500mm Center f/11 500mm Center Crop f/11

500mm Edge f/5.6 500mm Edge Crop f/5.6

500mm Edge f/8 500mm Edge Crop f/8

500mm Edge f/11 500mm Edge Crop f/11


What I wanted to test most was the difference between edge and centre sharpness, as well as the effect of different apertures. To my eye the sensor is starting to battle ISO grain at f/11 and this is impacting the apparent sharpness. In the field I’ve tried stopping down the aperture to try and get a wider zone of focus, but it’s tougher the further out you zoom, and the images above support this observation.

My conclusions, in terms of the questions I was seeking answers to: firstly, there’s no noticeable change in sharpness from the centre to the edge at the shortest zoom, irrespective of aperture. The edge starts to soften only slightly as you zoom in towards 500mm, and that is independent of aperture.

The thing I didn’t expect was the sharpness at f/5.6 being so consistent throughout the zoom range. If you’re isolating a subject at the long end of the zoom then it’s probably not worth stopping down the aperture, so in future when I’m shooting I’ll just keep the aperture as wide open as I can unless I’m at the 200mm end of the zoom range.

It’s a truly amazing lens for the money and whilst I realise there are many other factors to consider, I at least answered my own questions.

]]>
Photography 2020-10-25T06:00:00+10:00 #TechDistortion
Astronomy With Zoom Lenses https://techdistortion.com/articles/astronomy-with-zoom-lenses https://techdistortion.com/articles/astronomy-with-zoom-lenses Astronomy With Zoom Lenses About a month ago I started renting a used Nikon 200-500mm Zoom Lens that was in excellent condition. Initially my intention was to use it for photographing the kids playing outdoor sports, namely Soccer, Netball and Cricket. Having said that the thought occurred to me that it would be excellent for some Wildlife photography, here, here and here, and also…Astrophotography.

Nikon 200-500mm Zoom Lens

I was curious just how much I could see with my D500 (1.5x as it’s a DX Crop-sensor) using the lens at 500mm maximum (750mm effective). The first step was to mount my kit on my trusty 20 year old, ultra-cheap, aluminium tripod. Guess what happened?

The bracket that holds the camera to the tripod base snapped under the weight of the lens and DSLR and surprising even myself, in the pitch dark, I miraculously caught them before they hit the tiles, by mere inches. Lucky me, in one sense, not so lucky in another - my tripod was now broken.

Not to be defeated, I applied my many years of engineering experience to strap it together with electrical tape…because…why not?

D500 and 200-500 Zoom on Tripod

Using this combination I attempted several shots of the heavens and discovered a few interesting things. My PixelPro wireless shutter release did not engage the Image Stabilisation in the zoom lens. I suppose they figured that if you’re using the remote, you’ve probably got a tripod anyhow so who needs IS? Well John does, because his Tripod was a piece of broken junk that was swaying in the breeze - no matter how gentle that breeze was…

Hence I ended up ditching the Tripod and opted instead for handheld, using the IS in the Zoom Lens. The results were (to me at least) pretty amazing!

Earth’s Moon

I photographed the Moon through all of its phases, culminating in the above Full Moon image. It’s by far the easiest thing to take a photo of, and in 1.3x crop mode on the D500 it practically filled the frame. Excellent detail and an amazing photograph.

Of course, I didn’t stop there. It was time to turn my attention to the planets and luckily for me several planets are at or near opposition at the moment. (Opposition is one of those astronomy terms I learned recently, where the planet appears at its largest and brightest, and is above the horizon for most of the night)

Planet Jupiter

Jupiter and its moons; the cloud band stripes are just visible in this photo. This is two images stacked: one exposure for the Moons and one for Jupiter itself. No colour correction applied.

Planet Saturn

Saturn’s rings are just visible in this image.

Planet Mars

Mars is reddish and not as interesting unfortunately.

International Space Station

The ISS image above clearly shows the two large solar arrays on the space station.

What’s the problem?

Simple. It’s not a telescope…is the problem. Zoom lenses are simply designed for a different purpose than maximum reach for photographing planets. I’ve learned through research that the better option is to use a T-Ring adaptor and connect your DSLR to a telescope. If you’re REALLY serious you shouldn’t use a DSLR either, since most have an infrared-cut filter which changes the appearance of nebulae; you need a digital camera that’s specifically designed for Astrophotography (or, if you’re crazy enough, hack your DSLR to remove the filter, which is possible on some models).

If you’re REALLY, REALLY interested in the best photos you can take, you need an AltAz or Altitude-Azimuth mount that automatically moves the camera in opposition to Earth’s rotation, keeping the camera pointed at the same spot in the night sky for longer exposures. And if you’re REALLY, REALLY, REALLY serious you want to connect that to a guide scope that further ensures the auto-guided mount is tracking as precisely as possible. And if you’re REALLY, REALLY, REALLY, REALLY serious you’ll take many, many exposures including Bias Frames, Light Frames, Dark Frames and Flat Frames, and image-stack them to reduce noise in the photo.
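As a toy illustration of why stacking many exposures helps (a hedged sketch of the principle only, not the pipeline of any particular astrophotography tool): averaging N frames reduces random noise by roughly the square root of N. Simulated here for a single pixel in pure Python:

```python
import random
import statistics

random.seed(42)

TRUE_SIGNAL = 100.0   # hypothetical "real" brightness of one pixel
NOISE_SIGMA = 10.0    # per-frame random noise level
NUM_FRAMES = 64       # number of light frames to stack

# Simulate repeated exposures of the same pixel, each with random noise
frames = [TRUE_SIGNAL + random.gauss(0, NOISE_SIGMA) for _ in range(NUM_FRAMES)]

# "Stacking" here is just averaging the frames
stacked = statistics.mean(frames)

single_frame_error = abs(frames[0] - TRUE_SIGNAL)
stacked_error = abs(stacked - TRUE_SIGNAL)
print(f"single-frame error: {single_frame_error:.2f}")
print(f"stacked error ({NUM_FRAMES} frames): {stacked_error:.2f}")
```

The same averaging applied per pixel across full frames, plus the dark/flat/bias calibration, is essentially what dedicated stacking software automates.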

How Much Further Can You Go With a DSLR and Lenses?

Not much further, that’s for sure. I looked at adding teleconverters, particularly the TC-14E (1.4x) and the TC-20E (2x), which would give me an effective focal length of 1,050mm and 1,500mm respectively. The problem is that you lose a lot of light in the process, and whilst you could get a passable photo at 1,050mm, at 1,500mm on this lens you’re down to an aperture of f/11 which is, frankly, terrible. Not only that, but reports seem to indicate that chromatic aberration is pretty bad with the 2x teleconverter coupled with this lens. The truth is that teleconverters are meant for fast primes (f/4 or better), not an f/5.6 zoom.
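Those focal length and aperture figures follow from simple arithmetic: the DX crop multiplies effective reach by 1.5x, and the 1.4x and 2x teleconverters cost one and two stops of light respectively, each stop multiplying the f-number by the square root of 2. A quick back-of-envelope sketch using those rules of thumb:

```python
# Back-of-envelope teleconverter arithmetic for a 500mm f/5.6 lens on a DX body
BASE_FOCAL_MM = 500
DX_CROP = 1.5            # Nikon DX crop factor
BASE_APERTURE = 5.6      # widest aperture at 500mm

effective = BASE_FOCAL_MM * DX_CROP  # 750mm effective with no teleconverter

for tc_factor, stops_lost in [(1.4, 1), (2.0, 2)]:
    focal = effective * tc_factor
    # Each lost stop multiplies the f-number by sqrt(2)
    aperture = BASE_APERTURE * (2 ** 0.5) ** stops_lost
    print(f"{tc_factor}x TC: {focal:.0f}mm effective at ~f/{aperture:.1f}")
```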

Going to an FX camera body wouldn’t help, since you’d lose the 1.5x effective reach of the DX sensor. You might pick up a few extra pixels, but the sensor on my D500 is pretty good in low light, so you’re not going to get a much better low-light sensor for this sort of imaging. (Interestingly, comparing the pixel density of the D500 DX and D850 FX sensors leaves my camera with 6.7% more pixels per square cm, so it’s still the better choice.)
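That pixel-density comparison is easy to sanity-check. A short sketch using commonly published sensor specs (the megapixel counts and sensor dimensions below are my assumed nominal figures, not from the article); it lands in the same ballpark as the ~6.7% quoted:

```python
# Pixels per square centimetre, D500 (DX) vs D850 (FX), from nominal specs
SENSORS = {
    # name: (pixels, width_mm, height_mm) -- commonly published figures
    "D500 (DX)": (20.9e6, 23.5, 15.7),
    "D850 (FX)": (45.7e6, 35.9, 23.9),
}

density = {}
for name, (pixels, w_mm, h_mm) in SENSORS.items():
    area_cm2 = (w_mm / 10) * (h_mm / 10)
    density[name] = pixels / area_cm2
    print(f"{name}: {density[name] / 1e6:.2f} MP per cm^2")

advantage = density["D500 (DX)"] / density["D850 (FX)"] - 1
print(f"D500 density advantage: {advantage * 100:.1f}%")
```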

How Many Pixels Can You See?

Because I’m me, I thought: let’s count some pixels. Picking Jupiter, because it’s big, bright and easy to photograph (as planets go), with my current combination it’s 45 pixels across. Adding a 1.4x teleconverter gets me to an estimated 63 pixels, and a 2.0x to 90 pixels in diameter. Certainly that would be nicer, but probably still wouldn’t be enough detail to make out the red spot with any real clarity.
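Since image scale is proportional to focal length, those estimates are just the measured 45-pixel diameter multiplied by each teleconverter factor, a trivial check:

```python
# Jupiter's measured diameter, projected through teleconverters
BASELINE_PIXELS = 45  # measured diameter with the bare 200-500mm at 500mm (750mm effective)

for tc_factor in (1.4, 2.0):
    # Angular size on the sensor scales linearly with focal length
    projected = BASELINE_PIXELS * tc_factor
    print(f"{tc_factor}x teleconverter: ~{projected:.0f} pixels across")
```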

Just a Bit of Fun

Ultimately I wanted to see if it was possible to use my existing camera equipment for Astronomy. The answer was: kinda, but don’t expect more than the Moon to look any good. If you want pictures somewhere between those above and what the Hubble can do, expect to spend between $10k and $30k AUD on a large aperture, long focal length telescope, heavy duty AltAz mount, tracking system and specialised camera, and add in a massive dose of patience waiting for the clearest possible night too.

If nothing else for me at least, it’s reawakened a fascination that I haven’t felt since I was a teenager about where we sit in the Universe. With inter-planetary probes and the Hubble Space Telescope capturing amazing images, and CGI making it harder to pick real from not-real planets, suns and solar systems, it’s easy to become disconnected from reality. Looking at images of the planets in ultra-high resolution almost doesn’t feel as real as when you use your own equipment and see them with your own eyes.

So I’ve enjoyed playing around with this but not because I was trying to get amazing photographs. It’s been a chance to push the limits of the gear I have with me to see a bit more of our Solar System, completely and entirely on my own from my own backyard. And that made astronomy feel more real to me than it had for decades.

The stars, the moon, the planets and a huge space station that we humans built, are circling above our heads. All you need to do is look up…I’m really glad I took the time to do just that.

]]>
Technology 2020-10-17T08:00:00+10:00 #TechDistortion
Solo Band https://techdistortion.com/articles/solo-band https://techdistortion.com/articles/solo-band Solo Band Apple’s new Apple Watch Series 6 was released with several new bands, of which the two most controversial are the Solo Braid and Solo Sport Loop bands. Whilst the braided band might look nice, my instant reaction was “that’s going to catch on everything”, and a few anecdotal reports floating around the internet recently of threads being pulled on these bands have validated my ultimate choice not to get that one.

Whilst I applaud Apple’s “Create Your Style” watch and band selector, the fact remains you STILL can’t select a Nike band or a Hermès band with your new watch. (I know right? No Hermès? I guess there’s always a Hermès store for that…the bands are next to the riding helmets I hear…)

Per Apple’s directions when ordering, I dutifully printed out the paper measuring-tape cutout to find my wrist size was between 6 and 7 - exactly half way. I opted for a 7 when I ordered, in plain white, then attended the Chermside Apple Store to pick it up at a scheduled time through their door/COVID19 “window” for pickups.

Once in hand I opened it and hastily put it on the watch and my wrist, only to find it was too loose. Logic being that it was probably going to stretch over time, I went back to the “window” to swap it for a Size 6, one size down. After attempting to return just the band (and failing), then trying multiple times to return the entire watch just to swap the band, after nearly 45 minutes I had the right fitting band and was on my way.

I’m not sure I’m complaining exactly as everything is relative. There are other parts of the world where Apple Stores are still closed due to local COVID19 lockdown restrictions, so I had it good…for sure.

Solo Loop Edge Gap

The gap at the edge is quite small and tight, which is how I like to wear my watches. (I hate loose watches)

Solo Loop on Wrist

The band to the untrained eye looks just like a traditional White Sport Band.

Solo Loop Underneath

The giveaway is underneath, where there is no pin, and that is ultimately the reason I like this band so much more than any of the existing sport bands. On standard two-piece sport bands, the pin isn’t so much the issue, it’s the slide-under segment through the hole that pulls out arm hairs on the way and places pressure on my carpal tunnel after many hours of wearing. (Sure, I could wear it more loosely, but refer above - I hate doing that)

Feel and Comfort

The solo loop band is softer than my White Sport Band and is elastic but firm. The rubber-like texture is balanced with a smooth finish so it doesn’t grab your arm hairs too much like a rubber-band would when you take it off or put it on.

Beyond this I’ve found that like the other sport bands it’s the best option when you get it wet as it’s quick and easy to dry.

I Really Wanted A Nike Sport Loop Though

I’ve been a huge fan of my nearly two year old Blue Sport Loop band, so much so that I’ve worn it more than any other band during that time, and it’s now frayed at the loop-back buckle and generally a bit worse for wear.

I had secretly hoped that when Apple released the Series 6 they would open up the selector to include Nike bands as options, alas they did not. So after wearing the Solo Loop for a week, I went back to the Apple Store and grabbed the band I actually wanted: the Spruce Aura Sport Loop.

Solo Loop and Sport Loop Outside

Side by side the Pure White of the Solo Loop contrasts with the subtle Green weave of the Nike Sport Loop.

Solo Loop and Sport Loop Inside

The Nike Loop is made from the same material and is just as comfortable as my previous favourite band, with the bonus of being a pleasant light colour that’s reflective in the dark.

Concerns with the Solo Loops

Much has been written about the Solo Loop being a bad customer experience, and certainly with so many Apple Stores not functioning as they used to due to COVID19 restrictions, finding the best fit is more difficult than it otherwise would be. That said, even were they open, the truly best way to get a feel for the band’s comfort isn’t wearing it in the store for two minutes - you really need serious time with it in general use, a few days or weeks, to know for sure if that size will work for you.

Notwithstanding this, the other issue is resale. Previously you could sell your Apple Watch or hand it down to other family members, but now the variable of “will it fit their wrist” needs to be considered. If not, the recipient is up for another solo band that fits, or one with flexible sizing that fits anyone.

If you can look past these issues, then the solo sport loop is comfortable, simple and I think better than the other Sport Bands on offer. That said…I’ll be sticking with my recommendation for the Sport Loops as the best all-round band for the Apple Watch.

]]>
Technology 2020-10-16T21:00:00+10:00 #TechDistortion
Kit vs Tamron vs Prime Shoot-out https://techdistortion.com/articles/kit-vs-tamron-vs-prime-shootout https://techdistortion.com/articles/kit-vs-tamron-vs-prime-shootout Kit vs Tamron vs Prime Shoot-out I’ve been reading and learning, trying and fine-tuning my photography setup (heck, isn’t that what all photography enthusiasts do?) and I’ve been looking at the gaps in my lens arsenal and looking also for duplication and overlap.

I started out loving zoom lenses, with my 55-200mm Nikon doing most of the work for outdoor sports, but with two of the key sports I was being called on to photograph happening at night or indoors in poor lighting (Netball and Basketball), I had to invest in a better zoom, with the Tamron 24-70mm f/2.8 being my choice.

It did a fine job and did double-duty for large group shots where I didn’t have space to move back and needed to work in close, though on a DX camera (Nikon D5500 and then D500) the 24mm short end wasn’t quite short enough. I invested in a second-hand kit lens on a lark, thinking it could do fine at the short end (18mm) for those tight situations. Unfortunately I kept having trouble with the sharpness of both the 24-70mm Tamron and the 18-55mm Nikon kit lens.

A thought occurred to me that I’d become spoilt by my growing prime collection (35mm f/1.8, 50mm f/1.8, 85mm f/1.8) which are sharp as a tack at pretty much every aperture. Then I read many, many semi-professional and professional lens reviews to try and decide if I was imagining things.

Then I thought, “Hey, I could just do my own test…”

…and here it is…

I decided to test their sharpness indoors under controlled lighting, with the D500 on a tripod, fired with a timer, adjusting the shutter speed to keep a constant ISO160. I set the back of some packaging, with a mixture of text and symbols, as the target, with the tripod at the same distance for each lens. The only variable I think I could have controlled better was the distance from the front lens element to the target: it was slightly different owing to the different lens designs, and the imprecision of the 50mm mark on each zoom made it hard to ensure the exact same image scale in the frame, but it’s close enough to make the point.

Tamron 24-70mm f/2.8 / Nikon 18-55mm f/3.5-f/5.6 / Nikon 50mm f/1.8 Lenses Tamron 24-70mm f/2.8 (Left) | Nikon 18-55mm f/3.5-f/5.6 (Middle) | Nikon 50mm f/1.8 (Right)

Finally, to match the apertures, I took photos across the range at two equivalence points possible on all three lenses: f/5.6, the widest the 18-55mm lens could open at that focal length, and f/8 because…f/8 and be there, or something like that. Additionally I tried f/2.8 to provide another point of comparison between the 50mm and the Tamron.


Firstly the f/5.6 Shoot-out…

18-55mm Nikon at f/5.6 18-55mm Nikon at f/5.6

24-70mm Tamron at f/5.6 24-70mm Tamron at f/5.6

50mm Nikon at f/5.6 50mm Nikon at f/5.6


Secondly the f/8 Shoot-out…

18-55mm Nikon at f/8 18-55mm Nikon at f/8

24-70mm Tamron at f/8 24-70mm Tamron at f/8

50mm Nikon at f/8 50mm Nikon at f/8


There’s no question that the 18-55mm Kit Lens is the worst of the three by an obvious margin. That shouldn’t be a revelation to anyone: it’s the cheapest lens I tried and honestly…it shows.

What’s more interesting is the colour reproduction and sharpness between the Tamron and the Prime. At f/8 I think the Tamron has better colour and is marginally sharper, but at f/5.6 it’s almost a wash. It’s easy to take the darker lines on the Tamron as the better representation, but the Prime picked up the dust and imperfections in the printed lines and text slightly better, leading to a slightly lighter colour.


Finally the f/2.8 Shoot-out…

24-70mm Tamron at f/2.8 24-70mm Tamron at f/2.8

50mm Nikon at f/2.8 50mm Nikon at f/2.8


In the end, the Tamron on balance seems slightly sharper than the 50mm Prime at 50mm, but the amount of light and the colour on the Prime are better. So what’s the conclusion? Clearly the Tamron is a fantastic lens, but the 50mm is probably good enough at 50mm, so the question is: why do I need both?

For me, personally, what is each lens really for? If I have a 50mm and an 85mm Prime, then I don’t really need the Tamron beyond 24mm. What’s clear to me is that I’m well covered between 35mm and 85mm with some great lenses, but I’m lacking a decent ultra-wide. The poor quality of the 18-55mm Kit Lens disqualifies it as a contender.

Hence I definitely don’t need or want the Kit Lens anymore. It’s just not up to the standard I’m looking for in terms of sharpness. Also, as hard as it is for me to part with it, the Tamron doesn’t fit a need I have any more. The gap I need to fill is in the ultra-wide category, which is difficult to achieve with a crop-sensor, and 24mm isn’t wide enough. My intention therefore is to replace them both with a sharper ultra-wide lens.

Which lens that is, I’m still uncertain, though the 10-20mm Nikon looks nice.

]]>
Photography 2020-09-05T06:00:00+10:00 #TechDistortion
WhitePapers And Photography https://techdistortion.com/articles/whitepapers-and-photography https://techdistortion.com/articles/whitepapers-and-photography WhitePapers And Photography I’ve had two side projects in the past few years that I think it’s time to consolidate into TechDistortion. As of today all of my Control System Space whitepapers will be moved here, and my PixelFed instance, Chidgey.xyz, will be redirected here.

Having looked at the type and volume of traffic it makes sense to consolidate them now rather than let them continue on for another year at their current homes.

The intention is to keep Podcasting over on The Engineered Network and everything else here at TechDistortion.

]]>
Technology 2020-07-04T06:00:00+10:00 #TechDistortion
Podcaster To AudioBook Narrator https://techdistortion.com/articles/podcaster-to-audiobook-narrator https://techdistortion.com/articles/podcaster-to-audiobook-narrator Podcaster To AudioBook Narrator I’ve been told for many years that I have a lovely voice, even before I started podcasting; lots more since then. Whilst I’ve been known for my accents and impersonations as well, some of which have actually got me in serious trouble in years past, it seemed a logical extension to consider audiobooks and vocal acting.

Upon putting my name down at an agency I wasn’t sure what to expect, then I landed an audition, and then I landed the narration job for an audiobook! I was ecstatic. Once that wore off I signed the contract and realised I was now on the hook to record, edit and supply a complete audiobook that someone else had poured their time, energy and effort into writing. It was my job to narrate that book and make it sing!

Easy huh?

Oh boy.

I think it’s fair to say that I underestimated how much work it would be and looking back, just how much I learned in making it.

Some of the key lessons I learned from this experience that weren’t obvious to me when I signed up:

  • It is NOT possible to record even a short book in a single recording session, especially in the midst of a COVID19 lockdown. My house is my recording studio and with the lockdown restrictions my recording windows were brief, disjointed and highly problematic. Whilst I accept this won’t always be the case in future, it made this book particularly challenging. Children, TVs and music blaring, neighbours with too much gardening time on their hands mowing their lawns constantly, a Harley-Davidson motorbiking enthusiast up the street: it was incredibly frustrating!
  • Keeping a consistent pacing of speech, the same tone and pitch between recordings is extremely difficult. I learned to record in blocks wherever I could to avoid differences in my voice, and keep my positioning in front of the microphone identical every time.
  • Test your gear twice before you start a recording session! I unfortunately had a bad cable and I didn’t realise until I had an hour recorded! I had to re-record all of it.
  • If you put down some audio and you start editing and the levels aren’t the way you like, admit defeat early and re-record it! I made the mistake of persevering with sub-par audio for several hours of editing but after a few listen-backs to the finished product, I just couldn’t give it to the client. It wasn’t good enough. I should have cut my losses hours earlier and admitted I’d had a hardware failure and just re-recorded before I spent any time trying to salvage it.
  • Make sure you pre-record at your set levels, keep the same recording booth layout and confirm all the way through your workflow to your audio editing final output to ensure every link in the chain is set correctly before recording for any significant duration.
  • Scan a few words ahead, read those words after a delay in your brain, listen back to what you said whilst re-reading the same text to confirm you read exactly what was written. This is as hard as it sounds, but after about the 3 hour mark I started to get the hang of it. Like learning Morse Code I was amazed my brain was able to bend itself around that way of read/speak/reviewing but it actually is possible.
  • You might be recording multiple chapters spread over different recording and editing sessions, but the end listener will hear them in succession; hence between chapters take the extra step of matching volume levels in post-production, as the listener will notice the differences.
  • This is someone else’s hard work. When you’re being paid to turn it into an audio form, you need to do your absolute best job to make their work SING! Give them your very best, don’t phone it in. If you need to re-record a sentence, a paragraph, a chapter, the WHOLE THING because it doesn’t make the grade, then just do it and do it right!
  • I edited in Ferrite on my iPad (as I do all my podcasts) and unfortunately there was a strange volume glitch (I submitted a bug report to the developer), but fortunately I learned how to work around it by force-restarting after a second track import which fixed it. Unfortunately I’d sent out a badly volume-matched final audio chapter before I realised the problem was with Ferrite. Not a good look.
  • Expect feedback from the client. I didn’t submit a single full chapter without at least one suggestion for improvement. Sometimes the written word just doesn’t translate into a spoken sentence that sounds correct. Some abbreviations should be spoken in full and others not. The pacing of some sentences and the emphasis might need to shift. I had all that sort of feedback but by incorporating it, I know the client will get the result that they want. It’s their book!

Of course this is the first audiobook I’ve ever recorded for a client. Realistically though it wasn’t what my friends and family expected. Firstly it wasn’t fiction, I didn’t do any voices, and spoke in my normal accent. In some audiobooks I’m aware of, narrators tweak sentences and ad-lib to an extent, lending their own personality to the reading. That isn’t always the case and wasn’t for this book.

Am I Planning Another AudioBook?

Absolutely yes, I am. I’ve done another audition and I’m working on my own series of AudioBooks as an Author-Narrator. The next time I’ll have a much better idea what to expect and am intent on doing an even better job each subsequent book I narrate.

So How Long Was This Book?

The book runs for just a touch under 3 hours, which is quite short for an audiobook, but I speak pretty fast. A “normal” narrator would take about 3.5hrs for the same word count. That said my client loved the pacing and that’s what matters to me.

So in terms of raw audio: unedited, including all re-records and edits, there was 5.3hrs in total. The entire book took approximately 28 hours to record, edit, re-record, normalise, remove noises, review and organise ready for release.

That’s a lot. I suspect I’ll get better next book but it’s no walk in the park. I lost about 10 hours where I had to re-record effectively a third of the book so that didn’t help…
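Putting those figures together gives a rough production ratio (simple arithmetic on the numbers above):

```python
# Rough audiobook production-effort arithmetic from the figures in the article
finished_hours = 3.0   # approximate final runtime ("just a touch under 3 hours")
raw_audio_hours = 5.3  # all takes, including re-records
total_work_hours = 28  # record, edit, re-record, normalise, review, organise

print(f"raw-to-finished ratio: {raw_audio_hours / finished_hours:.1f}:1")
print(f"work hours per finished hour: {total_work_hours / finished_hours:.1f}")
```

Roughly nine hours of work per finished hour of audio, which squares with the “no walk in the park” verdict.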

Conclusion

The book is “The Knack Of Selling” by Mat Harrington. In reading the book I have to admit, I learned a lot of little things I had long suspected were salesperson “tricks” and a few things I hadn’t picked up on too. So to be completely fair, not only did I record this book for Mat, I learned a lot about sales while I was at it!

It’s currently available on iTunes and the Google Play audiobook stores.

I’m planning my own audiobooks in future and I’m going to record some of my accents as well on my profile page at Findaway Voices.

If you’d like me to record your audiobook, reach out and let me know. I’d love to help bring your work to life too!

]]>
AudioBooks 2020-06-18T21:30:00+10:00 #TechDistortion
Until Overcast For Mac Comes Out https://techdistortion.com/articles/until-overcast-for-mac-comes-out https://techdistortion.com/articles/until-overcast-for-mac-comes-out Until Overcast For Mac Comes Out I listen to podcasts a lot, though less since I’ve been working from home full time. I want everything to channel through my desktop when I’m in front of it, so the best option for me is an integrated podcast player that works on all Apple platforms, including the iPad, iPhone, Apple Watch and macOS. The Apple Podcasts app meets this requirement, but it lacks Smart Speed, and it doesn’t handle playlists, podcast-specific settings and so on the way Overcast does, just the way I like it. (I’m a creature of habit too, I suppose)

Of course Marco has toyed with spending time developing a macOS port of Overcast but until that happens I needed a work-around. The requirements for my use case:

  • Use the Macbook Pro Audio System (External Speakers via the Audio Output on my Thunderbolt Dock)
  • Control Playback/Pause from the Macbook Pro keyboard
  • Keeps sync position for Overcast

I tried Undercast and a few other web-wrappers but to be honest, they were all terrible. The web player is a bare-minimum passable option that gets you by in a pinch, but that’s all. Then I remembered you can turn your Mac into an AirPlay receiver using an app from Rogue Amoeba: Airfoil Satellite can be trialled for free, but a licence costs $29 USD (plus applicable taxes). I had a copy lying around from years ago and I always install it (just in case) on every new machine.

Open Airfoil Satellite and set a Play/Pause shortcut that makes sense for you (I chose Command-Shift-P), then write an AppleScript that activates the app and sends that keyboard shortcut, and give the script its own keyboard shortcut via FastScripts. I chose F17 (I love my extended keyboard).

  on run
    if application "Airfoil Satellite" is running then
      tell application "Airfoil Satellite" to activate
      tell application "System Events" to tell process "Airfoil Satellite" to keystroke "P" using {command down, shift down}
    end if
  end run

It’s not perfect but it meets my criteria. There are other applications out there that do similar things, but I’ve had trouble with Automator since the Catalina update restricting what can be executed as a global shortcut from anywhere, which is why I’ve switched to FastScripts.

Hopefully that’s useful to someone until a native macOS app is released. You can just load up your playlist, pipe it through your desktop speakers, sync position is kept, Smart Speed is your best friend, and away you go :)

]]>
Technology 2020-04-10T08:15:00+10:00 #TechDistortion
Docks And Interference https://techdistortion.com/articles/docks-and-interference https://techdistortion.com/articles/docks-and-interference Docks And Interference For the most part I’ve enjoyed my 13" Macbook Pro TouchBar 2018 model with questionable keys, but shifting to a full-time work-from-home environment thanks to our unfriendly cold virus in recent times, I’ve begun to rely more heavily on a full-time setup. At work in an office I’d be up and down, in and out of meetings, and could write off the occasional glitch as a downside of working in a large downtown office building in the middle of RF pea-soup.

Not so much at home.

As an electrical engineer with a background in radio I’m well aware of the issues with wireless connectivity. Low-power wireless in particular can be thwarted by enough radio interference, even when it uses broadband or spread-spectrum techniques. So when I purchased a brand new Apple Magic Mouse 2 a few weeks ago, I could no longer avoid what had been nagging at me for over a year: there seemed to be something wrong with my Macbook Pro’s wireless connectivity. (Spoiler: or so I thought.)

Symptoms

I’ve had a Bluetooth Apple Magic Keyboard and Magic Trackpad 2 for over a year and they would occasionally disconnect from the Macbook Pro, and on the keyboard my keystrokes would occasionally lag behind what was shown on the screen. For the longest time I shrugged it off as passing and temporary.

Within the first minute of using the Magic Mouse 2 I was irritated by a stuttering cursor. Working from home I’ve been on Skype for Business, Microsoft Teams, even (shudder) Zoom audio and video conferences, on some days for 9 hours straight. The obvious thing to reach for was my AirPods. They’re only six months old and the audio in my ears sounded perfectly clear, yet I was getting consistent complaints from others on the conference call that my audio was breaking up, even though I was connected by hardwired Ethernet to my router and my upload/download speeds were first rate.

Diagnosis

Being a semi-professional podcaster (some say) I had plenty of audio gear to test my microphone. Quickly connecting my MixPre3 and Heil PR-40 to the Macbook Pro, now using the MixPre3 as the microphone and my AirPods as the receiver, there were no audio issues any more. I noted that when connected to my iPad or iPhone the AirPods had no microphone drop-outs. At this point it was clear the problem was either proximity to the Macbook Pro, or the Macbook Pro itself had some issue with wireless connectivity, specifically with these Bluetooth devices. To confirm the mouse stutter wasn’t the mouse itself I borrowed my son’s wired USB mouse and noted that it did not stutter when connected via the USB hub or via the Thunderbolt dock.

Next I cabled my Magic Keyboard 2 to my USB hub, hence disconnecting its Bluetooth connection. The mouse stuttering continued, though it appeared marginally better. Turning off the trackpad and AirPods entirely, the stuttering seemed ever so slightly less pronounced, though it was still visible and jarring.

Then, to isolate further, I disconnected the Macbook Pro from power: no change. I then disconnected the USB hub, and the improvement in stutter was marked. That left my attention on the only other item connected: the StarTech.com Thunderbolt hub. With it disconnected, the stuttering was gone.

[Image: The StarTech.com adaptor, with my attempts to shield and repair the cable]

Not Very Useful

I tried wrapping the StarTech.com cable with an RF choke and shielding, but whatever noise it was producing would not be silenced. I needed to connect the Macbook Pro to multiple screens, I needed hardwired Ethernet, and I only had 4 USB-C ports (mind you, that’s better than some of Apple’s laptops).

I’d been eyeing one of these off for what seems like years (more like 18 months) so I finally ordered the CalDigit TS3+ Thunderbolt dock. I ordered it via Apple and it arrived only two business days later.

CalDigit TS3+

Devices I currently have plugged into the TS3+:

  • Audio Output to my desktop speakers
  • Hardwired Ethernet to the router
  • Thunderbolt cable to my Macbook Pro
  • DisplayPort to 4K 28" Monitor #1
  • Thunderbolt Downstream to Cable Creation DisplayPort adaptor to 4K 28" Monitor #2
  • USB-A to 8TB HDD
  • USB-A to a Qi Charging Pad
  • USB-C to MixPre3
  • AC Power Adaptor (from the wall socket)

I’ve tested the SD card reader (I can pack away my old multi-card USB 2.0 reader now) and all of the other USB-A ports plus the front USB-C port, though they’re currently vacant. With this dock I packed away my USB-C 61W charger and Apple’s Macbook Pro USB-C cable as well. My Magic Keyboard 2 is back in Bluetooth mode, as are the Magic Trackpad, the Magic Mouse and the AirPods, and guess what?

No Mouse Stutter

No Audio Dropouts of the Microphone from the AirPods

Okay so was this a case of throwing money at a problem to make it go away? Kinda sorta, but truth be told it was more an expensive process of elimination.

[Image: Magic Keyboard, Magic Mouse and AirPods, all Bluetooth devices now happily working simultaneously]

Interference

The problem lies in one of three places, as it always does with anything wireless. For communication between two points you need A) a transmitter, B) a receiver and C) the transmission medium joining the two. In this case the transmitter probably wasn’t the factor: everything was within tens of centimetres of everything else, so signal strength wasn’t a problem, though interference could still be a factor at the receiver. A broad-spectrum interferer elsewhere would impact the devices no matter where you were in the house, no matter what you disconnected or didn’t, which ruled out a common household interferer.

So it comes back to the transmitter or the receiver, and the perspective of each. The mouse (or the AirPods acting as a microphone, sending data to the Macbook Pro) has only a small battery to power its Bluetooth transmitter. In the other direction, the AirPods as a receiver for audio playback (from the Macbook Pro to the AirPods) have the Macbook Pro’s more powerful transmitter to listen to, and the mouse’s receiver is one we can’t test independently.

If you have a localised interferer it will tend to drown out the nearest radio receiver. In this case, whatever is trying to communicate with the Macbook Pro via Bluetooth is going to struggle to pick out the desired signal over the top of the noisy interferer sitting right beside the Macbook Pro’s receiver. This manifests as lost data from the weaker transmitter (the battery-powered device) to the receiver in the Macbook Pro. In the case of the:

  • AirPods: broken up microphone audio
  • Magic Keyboard: occasionally delayed or lost keystrokes
  • Magic Trackpad: delayed selection/tapbacks, stuttering cursor movements
  • Magic Mouse: stuttering cursor movements

Hopefully that all makes sense but what was causing the interference?

First About Bluetooth

Bluetooth operates between 2.400 and 2.485 GHz, a narrow(-ish) 85 MHz of spectrum. Allowing for the guard bands at the top and bottom of that spectrum, it uses 79 channels of 1 MHz bandwidth each with Frequency-Hopping Spread-Spectrum (FHSS) technology. FHSS avoids narrow-band interference by constantly hopping between channels across the band, so a transmission only dwells briefly on any interferer’s frequency. Of course that’s fine if you only have narrow-band interference. Broadband interferers that spew noise across vast segments of a band will cause enough data loss to drop packets.
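To make that concrete, here’s a toy Python simulation of why FHSS shrugs off narrow-band interference but not broadband noise. The model is a deliberate simplification (real Bluetooth hops 1,600 times a second and adaptive hopping avoids known-bad channels; here each packet just picks a pseudo-random channel and is lost if an interferer occupies it):

```python
import random

CHANNELS = 79  # Bluetooth data channels: 1 MHz each across 2402-2480 MHz

def loss_rate(jammed_channels, packets=100_000, seed=1):
    """Fraction of packets lost when each packet hops to a pseudo-random
    channel and is lost if that channel is occupied by an interferer."""
    rng = random.Random(seed)
    jammed = set(jammed_channels)
    lost = sum(1 for _ in range(packets) if rng.randrange(CHANNELS) in jammed)
    return lost / packets

# A narrow-band interferer only wipes out a few of the 79 channels...
narrow = loss_rate(jammed_channels=range(3))
# ...whereas a broadband interferer (like a noisy USB 3.0 cable) raises the
# noise floor across most of the band, so hopping has nowhere left to go.
broad = loss_rate(jammed_channels=range(60))

print(f"narrow-band interferer: {narrow:.1%} of packets lost")
print(f"broadband interferer:   {broad:.1%} of packets lost")
```

With 3 of 79 channels jammed only about 4% of packets are lost and retransmission hides it; with 60 channels jammed roughly three quarters are lost and the link falls apart.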

USB 3.0

‘SuperSpeed’ USB (aka USB 3.0) has delivered significantly faster data rates for several years, but as clock speeds increase, so does the frequency of the EMI (Electro-Magnetic Interference) emitted, which is centred around the base clock frequency and its multiples, to the point where it becomes difficult to meet EMI compliance standards in some frequency bands. To avoid multiple narrow-band EMI peaks across the band, the concept of spread spectrum was applied to the data clock (in a manner of speaking). There’s an excellent article by Microsemi that explains: “Spread spectrum clocking is a technique used in electronics design to intentionally modulate the ideal position of the clock edge such that the resulting signal’s spectrum is ‘spread’, around the ideal frequency of the clock…”. This spreads the noise across a very wide frequency range, significantly reducing the narrow-band peaks, but at the cost of raising broadband noise across the band.
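The effect can be sketched numerically. This is a toy model, not the real USB 3.0 SSC profile (the spec down-spreads the clock by roughly 0.5% at a 30-33 kHz modulation rate; here the deviation is exaggerated to 10% and frequencies are normalised so the effect shows up in a short DFT):

```python
import cmath, math

N = 512          # samples in the observation window
F0 = 64 / N      # "clock" frequency, centred exactly on DFT bin 64
DEV = 0.10       # peak fractional frequency deviation (exaggerated for clarity)

def tone(spread):
    """Unit-amplitude cosine; if spread, triangle-modulate its frequency."""
    out, phase = [], 0.0
    for n in range(N):
        tri = 1.0 - 4.0 * abs(n / N - 0.5)  # triangle wave, -1..+1, one period
        f = F0 * (1.0 + DEV * tri) if spread else F0
        out.append(math.cos(phase))
        phase += 2.0 * math.pi * f
    return out

def peak_bin_power(x):
    """Largest single DFT-bin power near the carrier: a proxy for the
    worst-case narrow-band EMI peak a nearby receiver would see."""
    return max(
        abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))) ** 2
        for k in range(40, 90)
    )

pure_peak = peak_bin_power(tone(spread=False))
spread_peak = peak_bin_power(tone(spread=True))
print(f"peak narrow-band power falls to {spread_peak / pure_peak:.0%} of the unspread clock")
```

The spread clock carries exactly the same total energy, but its worst single-bin power drops to a fraction of the unspread tone’s: narrow-band peaks shrink while the floor around them rises, which is exactly the trade-off described above.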

Intel released a White Paper in 2012 that looked at the practical implementation of USB 3.0 and its impact on low-powered wireless devices operating in the 2.4 GHz band, specifically WiFi and Bluetooth. The following figure is extracted from that White Paper and shows the noise increase due to an externally connected USB 3.0 Hard Disk Drive.

[Image: USB 3.0 interference raising the 2.4 GHz noise floor. Credit: Intel 2012, Figure 3-3]

Intel’s commentary: “…With the (external USB 3.0) HDD connected, the noise floor in the 2.4 GHz band is raised by nearly 20 dB. This could impact wireless device sensitivity significantly…”

The Root Cause

In years past, when I had access to an RF Spectrum Analyser, I could have connected some probes to stray cables and known for certain, but based on a process of elimination it’s clear there were two interferers, most likely due to USB 3.0 components:

The StarTech.com dock started to cut out intermittently over 9 months ago. The cut-outs caused a HDD to disconnect multiple times, leading to a lot of frustration with directory rebuilding, reindexing and backup re-uploading, such that I couldn’t leave the drive connected to my Macbook Pro via the dock anymore. That drove me to seek out an independent USB hub, so I switched to a combination of CableCreation USB-C to DisplayPort adaptors and a cheap Unitek USB 3.0 hub via a cheap Orico USB-C to USB-A adaptor. This solution worked for a while but it ultimately consumed too many ports, and once I shifted to working at home full time it wouldn’t work.

Through use and abuse, the StarTech.com dock’s shielding and cabling became damaged; as for the cheaper Unitek USB 3.0 hub, I doubt it was ever particularly well shielded to begin with. I essentially got what I paid for, as it was rather cheap.

[Image: Miscellaneous USB hub and adaptors I used along the way]

Well Shielded Cables Please

Poorly shielded cabling on high-speed external data buses is far more often the culprit than you might think when you’re experiencing Bluetooth or WiFi issues. Whilst it’s true there are many layers to the comms stack, and the problem could be purely software or a faulty Bluetooth device, swapping out cables and docks may well solve your problems definitively.

I like to think about shielding as the bottle and RF Noise as the genie. Once that shielding is damaged or if it’s poorly designed or constructed, it lets the genie out of the bottle and once it’s out, it’s incredibly difficult to stop it interfering with other devices.

My advice: choose your USB hubs, devices and cables with care and treat them well, lest that EMI genie be let out of its bottle.

Hopefully this helps someone trying to understand why their BlueTooth devices are misbehaving, when said devices are in otherwise perfect condition.

]]>
Technology 2020-04-08T21:25:00+10:00 #TechDistortion
Kia Optima 2018 Auto-Steer https://techdistortion.com/articles/kia-optima-2018-auto-steer https://techdistortion.com/articles/kia-optima-2018-auto-steer Kia Optima 2018 Auto-Steer My rental vehicle in the US was a Kia Optima FE and it had a lot of extra little features I’d never been exposed to before. The one of most interest was auto-steer, or “lane keep assist” as it’s sometimes called.

The way I discovered it had this feature was when I was driving to Austin on a slow left hand bend when I felt the steering wheel start to pull me off the road. Ever so slightly disconcerting at 70mph! What the heck was tugging on the steering wheel? I initially thought the car needed a wheel alignment or the tyre pressures were badly off.

Thinking back, I’d had warning alerts going off for the hour beforehand without knowing what they were for. I realised the car had been complaining about my lane position. One of the challenges of driving on the other side of the road is that the sight-line you’re used to, from the driving position to the centre or outside lines of the road, is thrown out by sitting on the other side of the vehicle.

After a few days driving on the right hand side of the road I’d retrained my brain so that’s fine but the car was pointing this out to me for several hours before I realised what it was doing. (Please note: I wasn’t drifting OUT of my lane, but I was too far across to the right hand edge of my lane, not enough to cause an incident but enough to upset lane-keep).

Back to auto-steer. I realised through observation that the green steering wheel icon would appear at speeds above 40mph when the car could “see” solid or regularly dashed lines on either side of the roadway ahead. When it did, I could let go of the wheel for a period of time and the car would keep itself in the lane. It worked well enough but there were a few little problems.

  • Sharper bends were a fail: I pushed the car’s limits a bit on this one, with my hands at the ready as I let it steer through ever-sharper turns but ultimately I learned when I pushed it too hard to not trust it to steer itself on anything other than the most gentle of curves in the road
  • Missing lines caused jerking: This is what happened in the first incident I mentioned - there was a gap in the outside line of the road due to a series of driveway entrances on a more rural section of the highway which confused the auto-steer system
  • The no-hands-on-wheel alarm: After about 20 seconds of not touching the wheel, the system would alert you and cut out auto-steer if you didn’t grab the wheel. In practice, when I was lightly holding the wheel it wasn’t detected at all, especially on a straight stretch of roadway, and I had to forcibly inject a small correction into the wheel, even when it wasn’t warranted, to convince it I was actually holding on.
  • On freeways with lots of merges it’s rough: Particularly in heavy traffic I just turned it off and stopped using it. It wasn’t safe and I didn’t trust it. To be fair I have the same policy with cruise control - it has no place in heavily congested traffic at those speeds.

It’s not all bad news and limitations however:

  • You don’t drift if you look away anymore: You can say “always keep your eyes on the road” as much as you like, and if you need something from the passenger seat, glove box, sometimes even the radio, the advice is “pull over until it’s safe to do so”. The counter-argument with freeways is that this isn’t usually practical - most don’t have wide enough shoulders to safely stop, there’s too much traffic to safely stop, and there aren’t enough exits set aside for breaks - once you’re on it, you’re stuck on it. And if you do look away from the road, no matter how good a driver you are, you’ll start to drift the car in the direction you look or lean. With this feature, that’s no longer an issue.
  • Less tiring: I wouldn’t have thought it would have such an impact but driving back late at night when you’re tired the Auto-Steer made a huge difference. I found I could focus more on the cars around me (the few that were) and the map guidance and let the car take that cognitive load off of me. It worked really well.

I’m strongly considering a Tesla Model 3 or Model Y in a few years’ time when it’s time for my next car, and I’m now more excited than ever that this kind of technology is becoming cheaper and hence more accessible. Whilst the Kia implementation (according to other reviews I’ve read) isn’t as good as Tesla’s, it’s still good enough to be useful and I’m glad I had it.

]]>
Technology 2019-11-12T06:00:00+10:00 #TechDistortion
To Tip Or Not To Tip https://techdistortion.com/articles/to-tip-or-not-to-tip https://techdistortion.com/articles/to-tip-or-not-to-tip To Tip Or Not To Tip It’s been nearly two decades since I was in the USA and my understanding of how and when to tip has evaporated from my memory. I do recall that tipping used to be a bit less, perhaps 10% of the bill, whereas this trip the helpful suggestions started at about 20%. My understanding is that the minimum wage hasn’t kept up with inflation, and as a result people in the USA are relying on tips more today than ever before.

Having said that, I was told that tipping at a drive-through isn’t generally the done thing, and whilst you are technically served by someone in Target, Best Buy or another goods-purchasing store, tipping isn’t required in those instances either, as staff have a higher hourly rate that factors in the lack of tipping.

The idea seems to be that the more personal, face-to-face “service industry” (which can be confusing, since someone telling me about a product in Best Buy is still ‘serving’ me) is where you’re expected to tip, proportional to the service offered by the staff.

Okay, so far I’m wrapping my head back around it. Next problem: when I came to the USA previously, the majority of transactions were one of two types: cash and credit card. Cash was easy - they give you the bill and you pay that amount plus a bit extra for the tip. Then you can ask for a receipt if you like. Super simple.

With credit cards in a sit-down restaurant you’d be given a small folding wallet with the bill in it and a slot for your card. You’d fill in the tip amount, insert your card and hand it back. They would wander off with it and (hopefully) come back without skimming your card, you’d sign, and you’re done. Although requiring some degree of trust, that was also straightforward to me.

Where I got lost this time was payment at the till using a credit card, either inserted (chip), swiped or pay-wave. In these cases they’d show me the amount, I’d insert my credit card, they’d print a receipt, then I’d sign it, add a tip, total it and hand it back. At that point what happens? I assume the original transaction is re-run for the new total? It’s not clear how that authorisation happens, but they all seemed to accept this. Oh well, hope that worked. Sometimes they’d give me a second receipt that included the tip amount; other times they wouldn’t, some looking confused when I asked for a receipt since I was still holding the pre-tip copy.

The final conundrum was takeaway that wasn’t a seated restaurant meal but wasn’t a drive-through either: there I was given conflicting advice on whether to tip. The most regular example was a barista. I defaulted to tipping, though in the end I did it because I didn’t want to upset anyone, rather than as a reflection of the service.

The problem is that if you didn’t grow up in a tipping culture there’s no accepted set of rules, and a lot comes back to the potential to reward good service. If you’re confused about whether tipping is the right thing to do, you can end up insulting someone who’s good at their job and probably deserved a tip, at least in their opinion or by the rules they’re told apply.

I was once lectured by someone that grew up in that culture after they visited my country and they were horrified by a bad experience in a hotel blaming it all on our country’s lack of tipping leading to poor customer service. That was 20 years ago mind you, but I’m not entirely sure it’s that simple.

Either way, towards the end of my trip I was so confused about the tipping grey areas that I realised I was developing a ‘tipping anxiety’: I was starting to avoid situations where it was unclear whether, or how much, I should tip. Sigh.

Maybe I’ll do better next t(r)ip.

]]>
Technology 2019-11-11T06:00:00+10:00 #TechDistortion
My Texas Driving Experience https://techdistortion.com/articles/my-texas-driving-experience https://techdistortion.com/articles/my-texas-driving-experience My Texas Driving Experience When driving through Texas this past week I was greeted by those overhead digital signs that carry an inspirational or perhaps cautionary message. Of the messages I saw, one in particular stuck in my mind. Whilst I didn’t write down the exact wording, the message in essence was that 2,871 people had died on Texas roads so far in 2019.

Given that the message was up the entire time I was there, I expect this covered January to October inclusive (about 300 days), which works out to about 19 people killed every two days in Texas alone.

Okay, so Texas is a big state with a big population, so what does that equate to per head of population? There were 28.7M people living in Texas as of 2018, which isn’t that different from all of Australia (25M). The current statistics in Australia from January to September 2019: 914 people killed (about 1,015 pro-rated over 10 months), for an average of 6.7 every two days, which means Texas sees roughly 2.5 to 3 times as many road deaths as Australia.
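For the number-inclined, the back-of-envelope arithmetic behind those figures (using only the numbers quoted above, with the Australian toll pro-rated from 9 to 10 months) looks like this:

```python
# Quoted figures: 2,871 deaths in Texas over ~300 days (Jan-Oct 2019),
# 914 deaths in Australia over 9 months, pro-rated to 10 months below.
texas_deaths, texas_days, texas_pop = 2871, 300, 28.7e6
aus_deaths = 914 / 9 * 10          # ~1,015 over the same 10-month window
aus_days, aus_pop = 304, 25e6

texas_per_two_days = texas_deaths / texas_days * 2
aus_per_two_days = aus_deaths / aus_days * 2
per_capita_ratio = (texas_deaths / texas_pop) / (aus_deaths / aus_pop)

print(f"Texas:     {texas_per_two_days:.1f} killed every two days")
print(f"Australia: {aus_per_two_days:.1f} killed every two days")
print(f"Texas road toll per capita: {per_capita_ratio:.1f}x Australia's")
```

Per capita the ratio lands around 2.5x; the raw every-two-days comparison (19 vs 6.7) is closer to 3x because Texas has a somewhat larger population.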

In conjunction with this I’d like to point out a few other observations with comparisons to Australia:

  • Speed Cameras: In the whole time I was driving in the USA this week I counted one speed camera - a roadside trailer mount unit. I never saw a speed camera on a traffic light, intersection, or mobile van. I’m sure they exist and maybe I missed them? In Australia scarcely a day passes when I’m driving when I don’t see at least one mobile unit, or trailer mounted unit and my commute takes me through one twice each day. In Australia the detractors would tell you they are merely revenue raising machines but the truth is they make a lot of people think twice about speeding.
  • Speeding: In Australia I sit on the speed limit and on the freeway I’ll get overtaken by maybe every 15th-20th car at most, whereas in Houston and driving between Corpus Christi and Austin I drove at the speed limit and was overtaken by almost every car! My best estimate was that most cars were driving 5-10mph over the limit. It was slightly scary.
  • Dangerous Driving in Wet Weather: During the wet weather in Houston on Thursday I was tailgated, cut off multiple times and the other drivers seemed to not care that it was wet with many still speeding and overtaking as they had in the dry. The amount of risk taking was insane, and whilst I won’t pretend that Australian drivers are angels, there was far more respect, slower driving speeds and caution in the wet, especially heavy rain.
  • Running Red Lights: On a typical commute in Brisbane I’ll see maybe one or two cars run a red light, however we have red light cameras fitted at many intersections so most of the time people don’t or won’t risk it. I lost count of how many cars blatantly ran red lights and honestly I began to pay additional attention to make sure everyone had stopped before I entered intersections, much to the annoyance of those behind me. Self-preservation y’all.

It’s likely that high-density traffic in major cities is a focal point for accidents, and it’s possible that the many freeways and congestion of large Texas cities amplify impatience, which may go some way to explaining a road toll triple that of my home country.

In the end there are probably a lot of complicated reasons why it’s so horrific, but any way you slice it, that’s a massive amount of bloodshed on the roads. There are other places in the world where people drive just as much or even further on average, at or above those speed limits, with significantly fewer fatalities. It can be better.

Anywhere you’re driving, drive safely. Please. Really, seriously please drive safely.

]]>
Technology 2019-11-10T11:30:00+10:00 #TechDistortion
IndieWeb Meetup https://techdistortion.com/articles/indieweb-meetup https://techdistortion.com/articles/indieweb-meetup IndieWeb Meetup Once I knew the conference dates in the States I realised that the IndieWeb meetup in Austin would be happening on the first Wednesday of the month, which was an evening when I would be in Houston. Noting it was a mere 2.5 hour drive (far closer than a 28 hour door to door flight) I decided to drop by.

I arrived at 6:30pm exactly, met a fellow geek who recognised my geekiness from my shirt and mentioned it was his first time coming to a meetup, not knowing what anyone looked like. Initially we didn’t see anyone else obvious so I ordered a coffee and then we checked again.

I recognised Manton immediately and we found a table to fit us all - seven in total. After introductions we talked about web development, the differences between ActivityPub and WebMention, different projects and sites we’re hosting and how, podcasts we’re involved with and lots and lots more.

It’s odd, but for most of us being complete strangers it felt quite comfortable, and as I write this I realise just how much I’ve missed out on by not living in or near hubs where like-minded software developers tend to live. Austin has become a focal point, San Francisco has been one for some time, whereas in Australia there aren’t really any I know of - perhaps Adelaide up to a point - certainly none near me.

As the evening was closing Manton walked through the upcoming IndieWebCamp which sounds really interesting so if you’re a developer in the area I’d check it out.

We talked for over 1.5 hours in total and I had a great time. If you’re in the Austin area and you’re interested in becoming (or already are) a web developer then I highly recommend dropping by a meetup. The venue is usually Mozart’s Coffee, which makes great coffee and has a wonderful setup with no issue parking, though to be safe I’d follow Manton for announcements and updates.

Thanks to Manton Reece for organising it and to everyone else that attended and made me feel welcome.

]]>
Technology 2019-11-09T11:30:00+10:00 #TechDistortion
USA Fast Food Part Two https://techdistortion.com/articles/usa-fast-food-part-two https://techdistortion.com/articles/usa-fast-food-part-two USA Fast Food Part Two As I’m now flying back home it’s time to recap my culinary experiences since my last post. Again the following aren’t in order of anything:

  • Dunkin’ Donuts: Meal: Dinner (yeah I know keep that to yourself), Food: Chocolate Glazed, Maple Iced, Boston Creme, Mint Cookie and “GO TEXANS” special; My blood sugar was bottoming out and I needed sugar and since it was nearby I thought what the heck. Anyhow I actually found most of them to be quite dry. I liked the Maple, Boston Creme and Mint Cookie ones, but the others were quite forgettable.
  • Toll House Cookies: Meal: Snack, Food: Chocolate Chip, Double-choc Chip, White Chocolate Chip and Macadamia, Mini-choc chip with frosting; Honestly I couldn’t find anything to complain about for any of these. They bore a very similar taste to a local chain in Australia called Mrs Fields. Very nice though.
  • Aunty Mays: Meal: Snack, Food: Pretzel Dog; Grabbed this on a lark on the way to the gate to try it. Honestly thought it was pretty good though the “pretzel” was a bit doughy but then I suppose that’s intentional and/or inevitable.
  • Popeyes: Meal: Lunch; Drink: Diet Coke, Food: Chicken Sandwich with Chips; Firstly I was really impressed by the chips: nice batter, crispy and yummy. The burger was originally offered a few months ago and was considered by some food critics (yeah, actual food critics) to be better than Chick-Fil-A’s, but as it was a “Limited Time” offer at that point, Popeyes withdrew it from sale, only to return it to the menu coincidentally a week before I came to the USA. Hence I was advised to get one this time around, noting that someone had been stabbed in line waiting for one only a few days prior. Hmm. Well, the one I had was very nice for sure, though nothing I’d stab anyone for, and I still think the Chick-Fil-A Chicken Sandwich was nicer overall. That said, it’s probably a step ahead of KFC.
  • Denny’s: Meal: Breakfast, Drink: Coffee, Food: The Grand Slamwich; The coffee was also surprisingly good given what it was; like IHOP before it, just don’t add anything to the coffee and drink it black. The Slamwich was actually really nice, but I had no idea what to make of the shredded-style hash brown. The potato seemed undercooked; I suppose falling apart is meant to be a feature, but it wasn’t one I wanted.
  • Cinnabon: Meal: Lunch, Food: Cinnamon Sticks and Classic Scroll; I found the sticks a bit average without the dipping cream/whip, but the scroll was off the charts! I knew Cinnabon were intending to open their first Australian stores in the next few months, but they were right next to me in the food court at the time, so I had it anyway.
  • Waffle House: Meal: Breakfast, Food: Original Waffle with a side of bacon; I was stunned how good the bacon was, but to be honest the Waffle was a bit tasteless, no matter how much whipped butter or syrup I added.
  • Starbucks: Truth be told I drank the same drink multiple times during the week: Venti Latte with an extra shot. Despite what I’d heard, Starbucks in the USA wasn’t that different from Australia. And whilst I normally have my Ventis with an extra shot, I had to repeat that request to almost every barista - apparently that’s an odd request over here.
  • McDonalds: Drink: Coffee; Having compared Starbucks coffee to Australia’s, it wouldn’t be fair if I didn’t also compare McDonalds coffee. Unfortunately the USA version was pretty bad compared to home. About 9 months ago McDonalds in Australia introduced their “New Blend”, which was less over-roasted than their usual blend to that point, and I’ve come to not mind it in a pinch, whereas previously I’d only drink it if I was truly desperate. This tasted much like the over-roasted kind they used to sell back home, and it wasn’t that pleasant. The other item of note: I asked for a large and OMG it was comically huge! At least 30% bigger than the same “large” in Australia. I wasn’t prepared for that and couldn’t finish it - there was just too much.
  • Mozart’s Coffee (Austin): Meal: Dinner, Drink: Regular Latte, Food: Snickerdoodle Cookie; Okay, I made the side trek to Austin to go to the IndieWeb meetup and meet Manton Reece, of whom I’ve been a fan for years. Whilst there I had what was actually the only really nice quality coffee of my USA trip. That said, I never tracked down any specialty roasters in Houston and just tripped into Starbucks almost every time, which was easy since they were EVERYWHERE! Anyhow, the cookie was a bit dry and crumbly but quite tasty.

To reiterate the following notes once again:

  • I am not a food critic
  • Repeat: Not a food critic
  • Your taste bud mileage will definitely vary and all tastes are very different
  • There are many other options on menus but I can’t try them all in a week

In summary I’m really glad I tried this fast food. I sensed a bit of bewilderment from some of my friends: I got the feeling they thought I should be eating “better” options rather than the most popular fast food chains. Some suggested restaurants with award-winning dishes and their personal niche chains, for example.

I considered their suggestions seriously and decided the way to think about it was this…

The Fast Food chains I tried are a mixture of good marketing, good pricing, good food and overall popularity amongst a significant number of Americans. If I truly want to have the most representative American food experience then I should start with those restaurants and fast food outlets that the majority of Americans prefer. If they didn’t prefer them, they wouldn’t have succeeded in their business. Hence most of my choices.

Both of our countries have brought different culinary options to the table and the world is a better place for it. I’m grateful for the advice from my friends and family on what to try, and I regret nothing that I tried this trip. It was fun but I’m ready to get back eating healthier meals again now. My body is quite frankly done with junk food for a few weeks. (At least)

I look forward to returning to the USA again next year to sample some more.

Thank you America :)

(…until next time…)

]]>
Technology 2019-11-08T11:30:00+10:00 #TechDistortion
Audible Alert https://techdistortion.com/articles/audible-alert https://techdistortion.com/articles/audible-alert Audible Alert The other night, I was just minding my own business having dinner at Olive Garden, enjoying a bread loaf… I mean “stick” - yeah, that thing’s more like a bread trunk or bread branch than a stick.

I digress…

An odd alert sound went off through the entire building. At first I thought it was a car alarm going off outside. It wasn’t. I looked around and nobody seemed to be reacting, flinching or panicking. In fact, most people looked as though nothing out of the ordinary was happening and kept eating, talking and walking by. I was, frankly, puzzled. My iPhone is currently roaming internationally on AT&T, and moments later I received an Amber Alert on it.

Those who follow me know that I always turn the volume off, relying on my Apple Watch for haptic feedback for incoming calls, messages, everything, so I was shocked when my phone made noise and started vibrating! I had no idea what an Amber Alert was, so I Googled it (as you do) and realised that it’s part of the US emergency alert system. I had heard of it, but never connected what it was until I read about how it worked.

I hope they find the child that was abducted - that’s a horrible thing and not unique to America. It happens the world over and it’s terrible.

The other disturbing part for me, upon reflection, was the lack of movement, lack of concern, lack of any real detectable reaction from the locals in the restaurant.

I study control system human interfaces in my job, and there’s a field of study that focuses on the desensitisation of people to repetitive alarm inputs. How often must people be getting these alerts to have that reaction - i.e. no reaction? I looked it up. In 2018 there were 200 Amber Alerts issued nationwide, averaging about one every two days. According to the Amber Alert website, as of April 2019, 957 children had been rescued specifically because of an Amber Alert since the program began in 1996. Of course that’s an amazing result, but I can’t get past the reactions of the locals.
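As a quick back-of-the-envelope check of that rate (the 200-alert figure is the one quoted above; the snippet is purely illustrative arithmetic):

```python
# Back-of-the-envelope check of the quoted Amber Alert rate for 2018.
alerts_2018 = 200        # alerts issued nationwide in 2018 (per the figure above)
days_per_year = 365

days_between_alerts = days_per_year / alerts_2018
print(f"On average, one alert every {days_between_alerts:.1f} days")
# → one alert every 1.8 days, i.e. “about one every two days”
```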

Systems like this will fade in effectiveness with the passage of time; it’s inevitable. In the meantime I just hope that people don’t treat them like a nuisance EMail alert, and pay attention.

PS: I looked for a vehicle matching the description in the car park and on the drive back to the hotel. I did not see it.

]]>
Technology 2019-11-07T11:30:00+10:00 #TechDistortion
USA Fast Food Part One https://techdistortion.com/articles/usa-fast-food-part-one https://techdistortion.com/articles/usa-fast-food-part-one USA Fast Food Part One I used to have a dangerous weight problem, and after other weight-loss methods failed over the years, I had weight-loss surgery. It’s been hard and I’m not even sure I recommend it, but it was effective enough. Since then, keeping my weight under control has been easier, but I still need to be careful. If I eat too much it hurts. Physically. Stabbing pains. Ouchie - no no. I generally can’t drink anything until 45 minutes after I’ve eaten, so a few sips before I eat is the way to go.

That said, I’ve been away from the USA for nearly two decades, and between TV being more international, listening to lots of podcasts by Americans, Twitter and Facebook, I’ve heard many references to Fast Food outlets, various restaurants and the like - most of which don’t exist outside of the US.

Hence when over here for a conference I made it my personal mission to try as many as reasonably possible. Without making myself feel ill or enduring searing stomach pains…

I’ll release a full update on the flight back, but for now here’s what I’ve tried and my thoughts on each. Please note: it’s not possible to try every single menu item - it’s a one-hit-one-outlet-one-meal kinda deal - so I’ve asked friends for recommendations and, of course, done my best to pick… The following aren’t in order of anything:

  • IHOP: International House of Pancakes. Meal: Breakfast, Drink: Black house coffee, Food: Mexican Churro Pancakes; Okay, an odd choice for breakfast but damn these were amazing! The sauce between the layers was delicious and overpowered any of the three syrups I tried (Old Fashioned, Butter Pecan and Strawberry). Oddly, the coffee actually wasn’t too bad. I’d heard horror stories but it was drinkable (black of course, no sweetener or anything).
  • Chick-fil-A: Meal: Lunch, Drink: Freshly squeezed Lemonade, Food: Number 2 Spicy with Waffle-cut Fries; The waffle-cut fries were a bit bland to be honest, and the Lemonade was so tart I couldn’t drink it, but oh my god, that chicken burger was next-level amazing! My previous favourite, the KFC chicken burger, has been comprehensively pipped for top spot. Truly outstanding!
  • Olive Garden: Meal: Dinner, Drink: Diet Coke, Food: Three-sample platter with Lasagne Fritters, Cheese Sticks and Fried Ravioli plus three bread-logs/sticks; Diet Coke is Diet Coke, dunno what to say, the bread sticks are comically huge but very nice - remind me of Sizzler cheesy toast in flavour a bit. All of the items on the sharing platter were amazing. I also love the on-table ordering device where you could order, play games (why?) and pay for your meal or summon people with it. I paid using it and was gleefully delighted despite the fact the waiter was standing right next to me the whole time for moral support. Thanks mate :)
  • Cheesecake Factory: Meal: Dessert, Food: 1 Slice of Reese’s Cheesecake Cake, and 1 Slice of Cookies and Cream; Well, these will last a few nights because I can’t have more than half in one sitting. They are both amazing and so evil I hate them because I love them. Ugh. Wow.
  • Whataburger: Meal: Dinner, Drink: Barq’s Root Beer, Food: Single Whataburger with Onion Rings; Root Beer was from a dispenser but geez the cup is made of styrofoam? And it was huge too and flimsy! That’s a regular size? I’ll ask for extra-small next time. Onion Rings were very nice - not the best I’ve had mind you but I’d say the best from a fast-food outlet. The Burger itself wasn’t that great until I added the Spicy Tomato Sauce and then all was forgiven - I’d put them above McDonalds and Burger King, and worth the effort for sure.
  • Newk’s: Meal: Dinner, Food: 16oz Chilli Bowl Soup, Slice of 6-layer Red Velvet Cake; The chilli was amazing and the cake was very nice and rich, but unfortunately I couldn’t finish it. Not as nice as Cheesecake Factory. Not even close.

Before moving on to part two, I’d like to add the following notes:

  • I am not a food critic
  • Repeat: Not a food critic
  • Your taste bud mileage will vary and all tastes are different
  • Doubtless there are other options on these menus to try but I can’t try them all in a week
  • I look forward to returning home when I’ll have to lose the 10 pounds I’m going to put on this week alone as a result of this :(

Part Two soon…

]]>
Technology 2019-11-06T11:30:00+10:00 #TechDistortion
USA Then and Now https://techdistortion.com/articles/usa-then-and-now https://techdistortion.com/articles/usa-then-and-now USA Then and Now In my last trip to the USA in 2000 I took a lot of photos and have a lot of fond memories. What’s interesting to me in returning nearly two decades later are some subtle differences that some people mightn’t have noticed if they’d lived through the gradual change.

The mix of Cars is very different

When I visited I recall vividly being dwarfed by large trucks - Dodge RAMs, Chevrolet Silverados and the like - with many Buicks, Chryslers and other American cars everywhere. Upon my return my rental car is a Kia, and on the road I see a roughly 50/50 split of US-made vs international (imported) vehicles. I realise the US motor industry has been struggling in some dimensions, but buyers not buying locally made vehicles isn’t a good thing. In Australia our local car manufacturing industry died only a few years ago; it’s now not possible to purchase a vehicle built in Australia. I hope the US doesn’t follow suit, and whilst I’m sure it won’t entirely, it’s a striking change in two decades and the source of some concern.

There are Fast Food Restaurants Everywhere

Maybe I wasn’t paying as much attention last time, but I swear that on every city block on the main roads there’s at least one food outlet. It’s also possible my memory of Houston is fuzzy (bound to be after so long) but it’s uncanny to me looking around as I’m driving. There’s no shortage of places to eat, and my observation from inside them is that no single one is particularly busy. Is there an oversupply to the market? Hmm.

OMG The Simpsons (S09E19) Weren’t Kidding About Starbucks

In 2000 I wasn’t drinking coffee, but I knew who Starbucks were. Back then there were 3,500 stores worldwide (okay, I looked it up on their website) and today there are 27,340. I mean - holy F@cking Cr@p! In the Simpsons episode, Bart is walking through the mall to get his ear pierced and is warned by the owner that in 5 minutes the shop will become a Starbucks, so he’d better hurry. As Bart departs, all of the stores have become Starbucks, including the one he was just inside. So yes, obviously that’s an exaggeration, but the conference I’m attending is in Memorial City in Houston, and in the Memorial Hospital there’s a Starbucks. There’s one in the Target, one in Macys and one in the dead-center of the mall itself. So that’s four stores in a radius of 750ft (230m)! That’s insane.

Having said all that there’s one thing I do remember about the busier parts of many US cities, of which Houston is no exception.

Concrete is everywhere

In other parts of the world, using concrete for roadways - freeways, highways, city streets and car parks - is an expense generally reserved for freeways alone, thanks to its lower rolling resistance, higher load capacity and longer lifetime. It’s just too expensive to put it everywhere! Where I’ve been driving in Houston there’s concrete everywhere. It’s like everything is a shade of light-brownish-grey-concrete colour (I’m not an artist - it’s like concretey-colour), broken up mainly by trees and grass. Of course it’s not wrong exactly, it’s just a really expensive way to do business. Then again, those car parks won’t need much maintenance for the next thirty years, and what’s a pot-hole? Not really an issue with concrete. The freeways are also an absolutely immaculate work of engineering art, with fly-overs, fly-unders and people speeding like heck! (Not me though - I’m finding the speed limit clearly isn’t fast enough for most other people.)

Anyhow it’s all good really. I do love the USA and I feel pretty comfortable here. The only shame is that I’m only here for a week. I am planning a proper holiday with the whole family late next year though, so hopefully I’ll have much longer to explore much further than I could this time around.

]]>
Technology 2019-11-05T11:30:00+10:00 #TechDistortion
John Tackles Blogvember 2019 https://techdistortion.com/articles/john-tackles-blogvember-2019 https://techdistortion.com/articles/john-tackles-blogvember-2019 John Tackles Blogvember 2019 Flicking through my RSS on the plane to the States, I came across Shawn Blanc’s post about blogging every day during November. Since I don’t have the time or commitment to finish anything longer than a short story (hence NaNoWriMo never worked for me), this seems more achievable. What the heck?

I started by back-publishing two articles that were 70% done and then yesterday wrote about the differences in travelling overseas in the current age vs when I was young.

This qualifies as a meta-post, which I ordinarily detest writing, but alas here we are anyway. The truth is that TechDistortion hasn’t been a regular blog for nearly five years as I’ve spent my time podcasting instead. Sometimes I cross the streams, mostly I don’t. There was a time a while ago when some people were encouraging novel-length podcast episode show notes, but thankfully that didn’t last long, and it’s not really the same thing. (People don’t read podcasts, they listen to them. Who knew?)

Whether my attempt to tackle “Blogvember,” which is its apparent moniker, yields anything of interest to the world at large may be judged by the masses upon its completion.

]]>
Technology 2019-11-04T02:30:00+10:00 #TechDistortion