
Herein you’ll find articles on a wide variety of topics: mostly consumer technology, plus items of personal interest to me. I have also participated in and created several podcasts, most notably Pragmatic and Causality, and all of my podcasts can be found at The Engineered Network.

Retro Mac Pro Part 2

I wrote previously about why I invested in a Mac Pro and I realised I didn’t describe how I’d connected everything up, in case anyone cares. (They probably don’t but whatever…)

New Desk Configuration with 3 4K Displays

The Mac Pro 2013 has three Thunderbolt 2 buses, and due to the bandwidth restrictions of 4K UHD at 60Hz, you can only fully drive one 60Hz display per Thunderbolt 2 bus. Hence I have the two 60Hz monitors connected via Mini-DisplayPort to DisplayPort cables, one to each bus.

The third monitor is an interesting quandary. I’d read that you can’t use a third monitor unless you connect it to the HDMI port, and since that’s only HDMI 1.4 it can only output 4K at 30Hz. However that’s not entirely true. Yes, it is HDMI 1.4, but that’s not the only way you can connect a third monitor. By using a spare Mini-DisplayPort to HDMI cable you can connect a monitor directly to the third Thunderbolt bus and it lights up the display, also at 30Hz.

I suspect that Apple made a design choice with the third Thunderbolt 2 bus, such that it also carries the two Gigabit Ethernet ports and the HDMI output. Limiting video output on that bus to 30Hz at 4K leaves the other components the bandwidth they require. In my case it’s annoying but not the end of the world, given the next best option was about four times the price.

Top View of the Mac Pro

Seeing as I have a perfectly good TS3+ dock, and Apple still sell Thunderbolt 2 cables and a bi-directional Thunderbolt 2 to Thunderbolt 3 adaptor, I’ve connected those to that third Thunderbolt 2 bus, and I drive the third monitor using a DisplayPort to DisplayPort cable from the TS3+ instead. This allows me to connect any USB-C device to the Mac Pro via the TS3+, and adds a much needed SD Card slot as well.

In order to fit everything on my desk I’ve added a monitor arm on each side for the side monitors, which overhang the desk, and placed the Mac Pro behind the gap between the middle and right-hand monitors. If I need access to the Mac Pro or the TS3+ Dock, I simply swing the right-hand monitor out of the way.

Moving the Right-hand Side Monitor reveals Mac Pro and TS3+

Since I podcast sometimes, I’ve also attached my Boom Arm behind the Dock, and the MixPre-3 is connected via the powered USB-C output on the TS3+, which works perfectly. Less interesting are the connections to the hardwired Ethernet, speakers and webcam, but that’s pretty much it.

Retro Mac Pro

After an extended period of mandated work-from-home due to COVID-19, I’ve had a lot of time to think about how best to optimise my work environment at home. I started with a sit/stand desk and found that connecting my MacBook Pro 13" via a CalDigit TS3+ allowed me to drive two 4K UHD displays at 60Hz, giving me a huge amount of screen real-estate that was very useful for my job.

I retained the ability to disconnect and move into the office should I wish to, though in reality I only spent a total of 37 days physically in the office (not continuously, between various lockdown orders) in the past 12 months. When I was outside the office, I used my laptop occasionally but found the iPad Pro was good enough for most things I wanted to do and its battery life was better, plus I could sign documents - which is a common thing in my line of work.

It wasn’t all smooth sailing though. I found that the MBP was actually quite sluggish in the user interface when connected to the 4K screens, and the internal fans would spin up to maximum all the time, often without any obvious cause. I started removing applications that were running in the background, like iStat Menus, Dropbox and a few others, and that helped, but I still noticed the fans spinning up during Time Machine backups and when using Skype, Skype for Business, Microsoft Teams and Zoom.

This was a problem since I spent most of my workday on Teams calls and the microphone was picking up the annoying background grind of the cooling fans in the MBP. For this reason I started thinking about how to resolve the two issues - sluggish graphics and running the laptop hot all of the time - without sacrificing the HiDPI screen real-estate on which I’d become rather dependent.

So I got to thinking: why am I still using a laptop when I’m spending 90% of my time at my home office desk? I wanted to keep using a Mac, and whilst I missed my 2009 Nehalem Mac Pro, I didn’t miss how noisy it was, its power drain, or the fact it was an effective space-heater all year round, and frankly it’s no longer officially supported by Apple1 anyway.

There are only a few currently supported Macs that can drive the amount of screen real-estate I wanted: the Mac Pros (2013, 2019), the iMac 5K (with discrete graphics) and the iMac Pro. There are, as yet, no M1 (Apple Silicon) Macs that can drive more than one external display. Buying a new Mac was out of the question with my budget strictly set at $1,400 AUD (about $1K USD at time of writing), so it was down to used Macs. The goal was to get a powerful Mac that I could extend and upgrade as funds permitted. The more recent iMacs aren’t as upgradable, even a used iMac Pro was out of my budget, and I won’t find a used 2019 Mac Pro since they’re too new and would be too expensive anyway.

So call me crazy if you like, but I invested in a used 2013 Mac Pro - a Retro Mac Pro, if you like. It had spent its life in an office environment and had lain unused in a corner for the past two years after its previous user left the company; they’d long since switched to Mac Minis. It had a damaged power cable, no box and no manuals, but apart from some dust it was in excellent condition.

I’ve now had it for just under a week and I’m loving it! It’s the original entry-level model: twin FirePro D300s, 3.7GHz quad-core Intel Xeon E5 with 16GB DDR3 RAM and a basic 256GB SSD. I can upgrade the SSD with a Sintech adaptor and a 2TB NVMe stick for $340 AUD, and go to 64GB RAM for about $390 AUD, but I’m in no hurry for the moment.

Admittedly the Mac Pro can only drive two of the 4K UHD screens at 60Hz, with the third at 30Hz, but that amount of high-DPI screen real-estate is exactly what I’m looking for. Dragging a window between the 60Hz and 30Hz screens is a bit weird, but I have my oldest, cheapest 4K monitor as my static/cross-reference/parking screen anyway, so that’s a limitation I can live with.

Yes, I could have built a Hackintosh.

Yes, I could run Windows on any old PC.

I wanted a currently supported Mac.

For those thinking, “But John, there’s Apple Silicon Macs with multi-display support just around the corner” - well yes, that’s probably true. But I know Apple. They will leave multi-UHD monitor support only for their highest-end products, which will still cost the Earth. So you might ALSO say, “But John, Intel Macs are about to die, melt, burn and become the neglected step-son that was once the golden-haired-child of the Apple family” and that’s true too, but I can still run Linux/Windows/ANYTHING on this thing for a decade to come, long after macOS ceases to be officially supported. That said, given that you can still apply hacks to the 2009 Mac Pro and run Big Sur, it’s likely the 2013 Mac Pro will be running a slightly crippled but functional macOS for a long time yet, or at least until Apple gives up on Intel support in favour of Apple Silicon features, but that’s another story.

And you might also think, “John, why the hell would you buy a Mac that’s had so many reliability problems?” Well, I did a lot of research given the Mac Pro 2013’s reputation, and based on what I found, the original D300 model was relatively fine with very few issues. The D500 and D700 models had significantly worse reliability: they ran hotter (being more powerful), and due to the thermal corner Apple had built themselves into with the Mac Pro design, they ended up being unreliable with prolonged usage due to excessive heat.

I can report the Mac Pro runs the two primary screens buttery smooth, is effectively silent, and doesn’t ever break a sweat. Being a geek, however, subjective measurements aren’t enough for me. The following Geekbench 5 scores are provided for comparison:

Metric              Mac Pro Score     MacBook Pro Score   % Difference
CPU Single-Core     837               1,026               - 22.5%
CPU Multi-Core      3,374             3,862               - 14.4%
OpenCL              20,482 / 21,366   8,539               + 239%
Metal               23,165 / 23,758   7,883               + 293%
Disk Read (MB/s)    926               2,412               - 260%
Disk Write (MB/s)   775               2,039               - 263%

By all the measurements above my MacBook Pro should be the better machine, and you’d hope so, being 5 years newer than the Mac Pro 2013. My usage to date hasn’t shown that however - almost the opposite - which leads to the conclusion that for my use case, where screen real-estate matters the most, the graphics power of a discrete FirePro is far more valuable than a significantly faster SSD. Not only that, but with the same amount of RAM you’d think the MacBook Pro would perform just as well; however it uses an integrated graphics chipset, so sharing that RAM while driving two 4K screens was killing its performance, whereas the Mac Pro doesn’t sacrifice any of its RAM and maintains full performance even when driving those screens.

I don’t often encode video or audio in Handbrake anymore, but when I do, the Mac Pro isn’t quite as fast as the MacBook Pro, though it’s pretty close, and certainly good enough for me. The interesting and surprising thing to note is that a 7 year old desktop machine was a better fit for my needs, at the price, than any current model on offer from Apple.

I’m looking forward to many years of use out of a stable desktop machine, noting that whilst my use-case was a bit niche, it’s been an effective choice for me.


  1. An officially supported Mac is one where Apple releases an Operating System version that will install without modification on that model of Mac. ↩︎

Podcasting 2.0 Phase 3 Tags

I’ve been keeping a close eye on Podcasting 2.0 and a few weeks ago they finalised their Phase 3 tags. Since I last wrote about this in December 2020, I thought I’d quickly update my thoughts on each of the Phase 3 tags (with a rough sketch of them in a feed after the list):

  • <podcast:trailer> is a compact and more flexible version of the existing iTunes <itunes:episodeType>trailer</itunes:episodeType> tag. The Apple spec isn’t supported outside of Apple, but more importantly you can only have one trailer per podcast, whereas the PC2.0 tag allows multiple trailers, and trailers per season if desired. It’s also more economical than the Apple equivalent, as it acts as an enclosure tag rather than requiring an entire RSS item as the Apple spec does.
  • <podcast:license> is used to specify the licence terms of the podcast content, either by show or by episode, against the SPDX definitions.
  • <podcast:alternateEnclosure> makes it possible to specify more than one audio/video enclosure for each episode. You could use this for different audio encoding bitrates, and video if you want to.
  • <podcast:guid> Rather than using the Apple GUID guideline, PC2.0 suggests a UUIDv5 generated using the RSS feed URL as the seed value.
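For illustration, here’s roughly what those four tags look like in a feed. This is a hedged sketch based on my reading of the namespace documentation - every URL and value below is made up, so check the Podcasting 2.0 namespace repository for the authoritative attribute lists:

<!-- Channel level: trailer, licence and guid (all values hypothetical) -->
<podcast:trailer pubdate="Thu, 01 Apr 2021 08:00:00 GMT" url="https://example.org/trailers/season2.mp3" length="12345678" type="audio/mpeg" season="2">Coming Soon: Season 2</podcast:trailer>
<podcast:license url="https://creativecommons.org/licenses/by/4.0/">cc-by-4.0</podcast:license>
<podcast:guid>917393e3-1b1e-5cef-ace4-edaa54e1f810</podcast:guid>

<!-- Item level: an alternate 64kbps enclosure alongside the regular one -->
<podcast:alternateEnclosure type="audio/mpeg" length="2490970" bitrate="64000">
  <podcast:source uri="https://example.org/episodes/episode1-64k.mp3" />
</podcast:alternateEnclosure>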

In terms of TEN, I’m intending to add Trailer in future and I’m considering Licence as well, but beyond that probably not much else for the moment. I don’t see that GUID adds much for my use case over my existing setup (using the CDATA URL at time of publishing), and since my publicly available MP3s are already 64kbps mono, an Alternate Enclosure for a low bitrate version isn’t going to add any value to anyone in the world. I did consider linking to the YouTube videos of episodes where they exist, however I don’t see this as beneficial in my use case either. In future I could explore an IPFS-stored MP3 audio option for resiliency, however this would only make sense if it became more widely supported by client applications.
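As an aside, the suggested GUID derivation is simple enough to sketch in a few lines of Python. This assumes, per my reading of the namespace docs, that the seed is the feed URL with the protocol scheme and trailing slashes stripped, and that the namespace UUID below is the one the spec defines - verify both before relying on it:

import uuid

# The Podcasting 2.0 "podcast" namespace UUID, per the namespace docs (verify before use)
PODCAST_NAMESPACE = uuid.UUID("ead4c236-bf58-58c6-a2c6-a6b28d128cb6")

# Seed: the feed URL with the scheme and trailing slashes removed (hypothetical feed)
feed_url = "example.org/feed/index.xml"
print(uuid.uuid5(PODCAST_NAMESPACE, feed_url))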

It’s good to see things moving forward, and whilst I’m aware that the Value tag is being enhanced iteratively, I’m hopeful it can incorporate client value and extend the current Lightning keysend protocol options so supporters can flag “who” their streamed sats came from (if they choose to). It’s true that customKey/customValue exist, however they’re intentionally generic for the moment.

Of course, it’s a work in progress and it’s amazing that it works so well already, but I’m also aware that keysend as it exists today might be deprecated in favour of AMP (Atomic Multipath Payments), so there may be some tweaks yet to come.

It’s great to see the namespace incorporating more tags over time and I’m hopeful that more client applications can start supporting them as well in future.

Pushover and PodPing from RSS

In my efforts to support the Podcasting 2.0 initiative, I thought I should see how easy it was to incorporate their new PodPing concept, which is effectively a distributed RSS notification system specifically tailored for podcasts. The idea is that when a new episode goes live, you notify the PodPing server, which adds that notification to the distributed Hive blockchain; any app watching the blockchain can then trigger the download of the new episode in the podcast client. At its heart the notification is just a single authenticated HTTP GET, as the sketch below shows.
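A minimal example using curl (the Authorization token and User-Agent come from the Podcast Index folks, as noted later; the feed URL here is hypothetical):

curl -H "Authorization: <TOKEN HERE>" \
     -H "User-Agent: <USER AGENT HERE>" \
     "https://podping.cloud/?url=https://example.org/feed/index.xml"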

This has come predominantly from their attempts to leverage existing technology, namely WebSub. When I tried the WebSub angle a few months ago however, the results were very disappointing, with minutes or even hours passing before a notification was seen, and in some cases it wasn’t seen at all.

I leveraged parts of an existing Python script I’ve been using for years for my RSS social media poster, but stripped it down to the bare minimum. It consists of two files: checkfeeds.py, which just creates an instance of the RssChecker class, and rss.py, which contains the actual code.

The beauty of this approach is that it will work on ANY site’s RSS target. Ideally, if you have a dynamic system you could trigger the GET request on an episode-posting event; however my sites are statically generated and the posts are created ahead of time (and hence don’t appear until a site rebuild coincides with a point in time after a post is set to go live), so it’s problematic to create a trigger from the static site generator.

Whilst I’m an Electrical Engineer, I consider myself a software developer across many different languages and platforms, but with Python I see myself as more of a hacker and slasher. Yes, there are better ways of doing this. Yes, I know already. Thanks in advance for keeping that to yourself.

Both are below for your interest/re-use or otherwise:

from rss import RssChecker

rssobject = RssChecker()

checkfeeds.py

CACHE_FILE = '<Cache File Here>'  # path to the cache of already-seen GUIDs
CACHE_FILE_LENGTH = 10000  # maximum number of GUIDs to remember
POPULATE_CACHE = 0  # 1 = seed the cache from the feeds without sending notifications
RSS_URLS = ["https://RSS FEED URL 1/index.xml", "https://RSS FEED URL 2/index.xml"]
TEST_MODE = 0  # 1 = send notifications but don't update the cache
PUSHOVER_ENABLE = 0  # 1 = send a Pushover push notification for each new entry
PUSHOVER_USER_TOKEN = "<TOKEN HERE>"
PUSHOVER_API_TOKEN = "<TOKEN HERE>"
PODPING_ENABLE = 0  # 1 = notify podping.cloud for each new entry
PODPING_AUTH_TOKEN = "<TOKEN HERE>"
PODPING_USER_AGENT = "<USER AGENT HERE>"

from collections import deque
import feedparser
import os
import os.path
import pycurl
import json
from io import BytesIO

class RssChecker:
    feedurl = ""

    def __init__(self):
        '''Initialise'''
        self.feedurl = RSS_URLS
        self.main()
        self.parse()
        self.close()

    def getdeque(self):
        '''return the deque'''
        return self.dbfeed

    def main(self):
        '''Main of the FeedCache class'''
        if os.path.exists(CACHE_FILE):
            with open(CACHE_FILE) as dbdsc:
                dbfromfile = dbdsc.readlines()
            dblist = [i.strip() for i in dbfromfile]
            self.dbfeed = deque(dblist, CACHE_FILE_LENGTH)
        else:
            self.dbfeed = deque([], CACHE_FILE_LENGTH)

    def append(self, rssid):
        '''Append a rss id to the cache'''
        self.dbfeed.append(rssid)

    def clear(self):
        '''Clear the cache'''
        self.dbfeed.clear()

    def close(self):
        '''Close the cache'''
        with open(CACHE_FILE, 'w') as dbdsc:
            dbdsc.writelines((''.join([i, os.linesep]) for i in self.dbfeed))

    def parse(self):
        '''Parse the Feed(s)'''
        if POPULATE_CACHE:
            self.clear()
        for currentfeedurl in self.feedurl:
            currentfeed = feedparser.parse(currentfeedurl)

            if POPULATE_CACHE:
                for thefeedentry in currentfeed.entries:
                    self.append(thefeedentry.get("guid", ""))
            else:
                for thefeedentry in currentfeed.entries:
                    if thefeedentry.get("guid", "") not in self.getdeque():
                        # New entry not yet in the cache: send the enabled notifications
#                        print("Not Found in Cache: " + thefeedentry.get("title", ""))
                        if PUSHOVER_ENABLE:
                            # POST a JSON payload to the Pushover messages API
                            crl = pycurl.Curl()
                            crl.setopt(crl.URL, 'https://api.pushover.net/1/messages.json')
                            crl.setopt(pycurl.HTTPHEADER, ['Content-Type: application/json' , 'Accept: application/json'])
                            data = json.dumps({"token": PUSHOVER_API_TOKEN, "user": PUSHOVER_USER_TOKEN, "title": "RSS Notifier", "message": thefeedentry.get("title", "") + " Now Live"})
                            crl.setopt(pycurl.POST, 1)
                            crl.setopt(pycurl.POSTFIELDS, data)
                            crl.perform()
                            crl.close()

                        if PODPING_ENABLE:
                            # Notify podping.cloud via a GET with the feed URL; auth in headers
                            crl2 = pycurl.Curl()
                            crl2.setopt(crl2.URL, 'https://podping.cloud/?url=' + currentfeedurl)
                            crl2.setopt(pycurl.HTTPHEADER, ['Authorization: ' + PODPING_AUTH_TOKEN, 'User-Agent: ' + PODPING_USER_AGENT])
                            crl2.perform()
                            crl2.close()

                        # Remember the GUID so each entry only ever notifies once
                        if not TEST_MODE:
                            self.append(thefeedentry.get("guid", ""))

rss.py

The basic idea is:

  1. Create a cache file that keeps a list of all of the RSS entries that are already live
  2. Connect up PushOver (if you want push notifications, or you could add your own if you like)
  3. Connect up PodPing (ask @[email protected] or @[email protected] for a posting API TOKEN)
  4. Set it up as a repeating task on your device of choice (preferably a server; a Synology, a Raspberry Pi or a VPS should all work)

VPS

I built this initially on my MacBook Pro using the Homebrew-installed Python 3 development environment, then installed the same on a CentOS 7 VPS I have running as my origin web server. Assuming you already have Python 3 installed, I added the following so I could use pycurl:

yum install -y openssl-devel
yum install python3-devel
yum group install "Development Tools"
yum install libcurl-devel
python3 -m pip install wheel
python3 -m pip install --compile --install-option="--with-openssl" pycurl

Whether you like pycurl or not, obviously there are other options, but I stick with what works. Rather than refactor for a different library I just jumped through some extra hoops to get pycurl running.
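For anyone who’d rather skip those hoops, here’s a minimal sketch of the same Pushover call using the requests library instead - assuming the same config constants from rss.py, and a hypothetical episode_title variable:

import requests

# Same JSON payload the pycurl version POSTs to the Pushover messages API
response = requests.post(
    "https://api.pushover.net/1/messages.json",
    json={"token": PUSHOVER_API_TOKEN,
          "user": PUSHOVER_USER_TOKEN,
          "title": "RSS Notifier",
          "message": episode_title + " Now Live"},
)
response.raise_for_status()  # raise if Pushover rejected the request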

Finally, I bridge checkfeeds.py with a simple bash script wrapper and call it from a cron job every 10 minutes.
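The wrapper and cron entry are trivial - something like the following hypothetical sketch, with made-up paths you’d adjust to suit your own setup:

#!/usr/bin/env bash
# checkfeeds.sh - run the feed checker from its own directory
cd /home/user/rsschecker || exit 1
/usr/bin/python3 checkfeeds.py

And the crontab entry to run it every 10 minutes:

*/10 * * * * /home/user/rsschecker/checkfeeds.sh >/dev/null 2>&1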

Job done.

Enjoy.