Categories
Uncategorized

1999-style hubris

The market high of March 2000 was obviously a peak of insanity; less obvious but more important, it was also a peak of clarity. People looked far into the future, saw how much valuable new technology we would need to get there safely, and judged themselves capable of creating it. We still need new technology, and we may even need some 1999-style hubris and exuberance to get it.

– Zero to One, Peter Thiel.


Discovery on Twitter – Part 2

(Part 1)

Processing interesting stuff on Twitter

Whenever I find anything that’s interesting, I typically send it to Instapaper, which is where I do my reading. If it’s an article linked to in the tweet, sending to Instapaper is simple enough. If it’s the tweet itself, Email Tweet does the job. If it’s a Twitter thread, though, it’s not straightforward. Twitter’s own app can only email a single tweet:

Tweetbot can email a thread, formatted quite well, but maddeningly it formats it reverse-chronologically, which is not how you want to show conversations:

This is unfortunate because the app itself formats it correctly.

I also add topics to my own Collections via the fantastic app Charm, which we have written about briefly before. It’s curious Twitter doesn’t expose Collections more. They do so much more than Bookmarks, which is the closest analog, though a pale one. As we have said in the blog post on Charm, there can be many Collections, and you can order the tweets within each.

I do not know where I will use these Collections yet. They will probably be embeds on future interest-specific pages on this website. 

I use Twitter in more ways than I had realised. The main timeline, Lists and Topics for discovery. Plus curation via Collections. I look forward to moving from discovery to more engagement with other people on Twitter in the years to come.

PS: The venture capitalist Nick Grossman has created an interesting way to hijack Twitter as a commenting system instead of an on-blog system like WordPress’ own or Disqus. Here is the description. This is the sort of hack that I love the web for.

(ends)


Discovery on Twitter – Part 1

My source of news and articles has always been my RSS feeds. I began setting up my OPML file over 15 years ago, and I have ship-of-Theseus-ed it to this day, from RSS reader to RSS reader.

To my RSS feed reading I have added newsletters. Gmail rules route them into a folder structure more or less like RSS. I like that just like I can host my OPML anywhere and use any RSS reader, I can sign up to newsletters from any email address and use any email client to read them.

Over the past year, though, Twitter has emerged as a third pillar of how I get daily information. I don’t directly follow a lot of people, but two other features of the Twitter service have transformed my experience:

One, Twitter lists

I use Lists for deliberate information consumption. Lists have existed for a long time, but it’s only in the past several months that I have created a lot of topical and interest-based private lists. So far, Twitter doesn’t show related content in lists the way it does in the main timeline – content such as ‘so-and-so liked’, or ‘so-and-so follows’ or ‘so-and-so received a reply’. As in earlier times, a list is only a reverse-chronological stream of tweets from the people on it. I like that third-party Twitter clients are able to support Lists, so I can read them distraction-free in Tweetbot.

Two, Twitter topics.

I’ve used Topics for even less time than Lists, but they are a promising discovery mechanism. Topics are only available in Twitter’s own app, which is a shame. At first I relied on the topics Twitter suggested on the Discover tab, but that selection turned out to be quite atrocious, so I picked a set from ‘Topics’ in the app’s main menu.

The verdict is that it’s just good enough to keep using, but it’s not great. It’s promising, and I hope Twitter gets good at this. There is incentive for them to do so: it’s where they display ads. Items from topics are displayed on the Discover tab, where there’s a large ad at the top and promoted hashtags to scroll through before you get to the actual content.

(Part 2)


Never let a crisis go to waste

About the Rural Electrification Administration (REA), which the American president FDR set up during the Great Depression to connect the remaining 90% of rural America to the electric grid:

The REA didn’t just barge into rural towns, install electricity and leave. Instead, the REA took two critical steps to make it a truly transformative agency: (1) it employed and empowered community members and (2) it taught people how to make the most of their newfound light. On step one, the REA relied on local partners to organize cooperatives that would supply the labor to build their community’s own electricity system…

America needs Universal Basic Internet. Achieving UBI requires a coordinated federal effort that, like the REA, (1) involves local stakeholders as a way to increase employment and civic pride and (2) creates an “internet circus” of trainers to increase digital literacy around the nation. This approach to closing the digital divide will go a long way in helping the nation recover from the COVID collapse.

COVID-19 shows we need Universal Basic Internet now

This is the best way to not let a crisis go to waste. It is also the best time to undertake transformational social programmes, regulatory reforms and hard national conversations about difficult issues.


Cocaine

Agree totally. My view of most social media, Twitter especially, is that it’s the digital equivalent of cocaine. People first do it because everyone else does. Then it makes them feel witty, confident and popular. Over time, it turns them into obnoxious creeps. In the end, it leaves them out of a job and feeling very sorry about all the people they’ve hurt.

– “Coronavirus Is Making Us Nostalgic for 1999”


Microsoft and Open Source: the admission and the rebuttal

Microsoft’s president Brad Smith, this week:

“Microsoft,” said Smith during a chat hosted by MIT’s Computer Science and Artificial Intelligence Lab (CSAIL), “was on the wrong side of history when open source exploded at the beginning of the century and I can say that about me personally.”

Eighteen years ago, I wrote a rebuttal to an article that argued against (!) the use of open source software in government departments in India. That article was written by Microsoft’s then-head of marketing. 

At times, it stretched one’s imagination a bit:

[Open Source’s] role in discouraging the development of commercial software threatens to undermine intellectual property, stifle innovation, and limit entrepreneurism while reducing choice in the market.

It questioned the security of open source software in general:

Moreover, once free software is installed, it also becomes a source of elevated security vulnerabilities for IT buyers, because the source code is freely available: no one person is responsible for it.

and ended with this flourish:

We believe that software has commercial value and attempts to render software free will ultimately undermine the software industry, causing less R&D to go into software development and ultimately less innovation for consumers.

It incensed me as a teenager then, and it amuses me now in my thirties. Revisiting this, it’s clear how spooked Microsoft was even then about Linux in particular and open source in general. 

Microsoft’s nightmare of Linux on every desktop never came to pass. And yet they were right to have been fearful, because they came very close to irrelevance. What happened to Microsoft is a classic example of how disruption is rarely just a better version of your product; it’s something else altogether.

In Microsoft’s case, the operating system mattered less once consumer and enterprise applications moved to the web – Salesforce was probably the first major ‘cloud’ enterprise application and Gmail the first major consumer one (webmail, including Microsoft’s own Hotmail, existed earlier, but Gmail changed email forever). Developers were now building for the web, on both the front and back ends, not for the desktop.

And then in 2007 the desktop ceased to matter. First with the iPhone and then with Android, one based on UNIX and the other on Linux, people’s primary computer completely changed. With that, Microsoft’s dominance in browsers was wiped out. Its search engine Bing, built into Internet Explorer, mattered little after Chrome and Safari were baked into these new computers, both defaulting to Google’s search. Microsoft’s already diminishing hold on application developers crumbled as people built apps for these new ‘smartphones’ and listed them on closed ‘app stores’, out of Microsoft’s influence. 

In response, Microsoft skated to where the puck was with Windows Mobile, but even then found that its primary distribution channel for Windows and Office, the enterprise, didn’t matter in the new world. Phone manufacturers had begun to build or buy their own OSes in response: Bada at Samsung, LG’s acquisition of webOS from HP. So Microsoft bought its way into the new distribution world by buying the dominant smartphone maker of the previous decade, Nokia. But it was much too late. Apple soaked up the profits and Android the volume – a year ago, the latter reported 2.5 billion active devices. Microsoft, once with a 90%+ share of the operating-system market, claimed 1.5 billion ‘active’ devices in October 2019.

Linux, in the end, won. But not open source. And not Microsoft.


Stablecoins, reserve requirements and units of account

The Financial Times on the inherent reserve requirements of the Facebook-led stablecoin Libra:

Furthermore the association fails to address how such a buffer will be maintained if the liquid securities it needs to invest in become negative yielding?

… even if Libra’s system did create cost savings which could then be passed on to customers, those extra-margins would be even more sensitive to negative interest rates than banks’ own net interest margins.

While banks have mostly not passed on negative rates to retail deposit holders en masse, they have been able to pass them on to subgroup account holders made up of corporate and high-net worth entities. Banks can also offset some of the net interest margin erosion from negative rates by continuing to lend at more positive rates in other sectors, something the Libra Reserve would not be able to do.

The only way to make this work is to keep the reserve in, well, reserve, but to invest it in short-term securities, which is exactly Libra’s plan. In which case, as we observed nearly a year ago,

That since the underlying basket includes short-term securities, Libra is also a security, not a currency as FB describes it. It behaves more like a (stable) ETF, and so a transfer is a buy/sell transaction that should attract (minuscule) capital gains.

But then it’s not a reserve. Maintaining the stability of such a large investment, albeit in short-term securities, will be a constant challenge. Banks, with their fractional reserves and their ability to make loans, are going to be at an advantage.

Eventually there will be a decentralised currency that is its own unit of account, free of these constraints, one that truly combines the movement of information with the movement of money and that is immutable, non-inflationary and, when required, pseudonymous. That is going to take a world free of, or parallel to, central banks. Of course, cryptocurrency won’t be close to the most interesting bit of such a world.


Chats, archives and memories

Email today is now less a communications medium than a communications compile target. It’s a clearinghouse technology. It’s where conversations-of-record go, where identity verification happens, where service alerts accumulate, and perhaps most importantly for publishers, where push-delivered longform content goes by default. It is distributed and federated, near universal, and is not monopolized by a single provider. 

– A Text Renaissance

The move to messaging is a pity precisely because we assume everyday chats are not ‘conversations of record’. I can easily look up conversations I had ten, fifteen years ago not only because they took place over email but also because Google Talk conversations too became their own (read-only) threads in Gmail. Conversations that are merely a year old are hard to look up because search in messaging apps isn’t as good as with even the most common email apps.

As for archiving your conversations in an open data format: WhatsApp and Telegram have text export, but it is an entirely manual process, to be done chat by chat, with no way of exporting a date range. For iMessage, well, this thread says it all:

First, make an iTunes backup, then find your iTunes backup folder:
macOS: ~/Library/Application Support/MobileSync/Backup/
Windows: %appdata%/Roaming/Apple Computer/MobileSync/Backup
There will be one randomly named folder, unless you have backed up more devices with iTunes. In that folder, there is a database file with your messages.
You can open it with a free SQLite database viewer and/or export the contents to CSV.
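The thread’s last step can be scripted rather than done in a viewer. Here is a minimal sketch in Python, assuming the backup’s messages database is an SQLite file with a `message` table holding `date` and `text` columns; the real backup stores the database under a hashed filename, so the path in the usage comment is a placeholder:

```python
import csv
import sqlite3


def export_messages(db_path, csv_path):
    """Dump the date and text of every row in the `message` table to a CSV file."""
    conn = sqlite3.connect(db_path)
    try:
        # Oldest first, so the CSV reads chronologically like the conversation did.
        rows = conn.execute("SELECT date, text FROM message ORDER BY date")
        with open(csv_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(["date", "text"])
            for date, text in rows:
                writer.writerow([date, text or ""])  # attachments can leave text NULL
    finally:
        conn.close()


# Usage (placeholder path -- locate the real database inside your backup folder):
# export_messages("Backup/<random-folder>/sms.db", "messages.csv")
```

Once in CSV, the conversation is in an open format that any spreadsheet or script can search, which is the whole point of the exercise.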

I’m not sure what export is like with Google’s existing chat apps, or others like Slack.

The message is clear – chat, where the vast majority of today’s conversations take place, is meant to be ephemeral. With a contact, we share pictures, break personal news, make confessions, share witticisms and inside jokes. The sum of all of that is our relationship with that contact. Today’s banalities are tomorrow’s memories.

We live in a time when that relationship can be archived and searched, when we can go back to it and delight and sober ourselves, unlike any era before us, and we unknowingly let that opportunity pass.


Packet switching, cloud gaming and good design principles

I must quote this rather large section of this excellent overview of “cloud gaming”:

In practice, players might not even see the same visual explosion, even though the “same” explosive hit the exact “same” tree at the exact “same” angle at the exact “same” time, and the exact same logic was applied to process the cause and effect. This variation doesn’t matter, and it’s odd to consider how a shared experience can vary with such trivial items. However, this inconsistency exists precisely because most information in an online multiplayer game (or shared simulation, if you will) is trivial. What matters are consistency, fairness, and technical resiliency (i.e. playability). Anything that isn’t core to that experience is deprioritized in order to preserve processing power for what does matter.

In truth, all players are looking at slightly out-of-date positions of one another – which means their local machines are processing the cause-and-effects of these out-of-date positions. Most of the time this doesn’t matter, but it often does. In this case, the online server determines the “truth” and sends out information to re-align all players. Most of these changes happen so fast that players will never even know. Indeed, there are many neat tricks to cover this up (e.g. consoles will often predict what is likely to happen and visualize it before it occurs, then skip frames or even “rollback” an in-process visual to “correct” the simulation).

Because of [latency inherent to the packet-switched nature of the Internet], multiplayer was designed in the early 1990s with the packets as small as possible and as much information as possible redundant. Multiple “truths,” in other words, are a feature, not a bug. For example, in a racing game, Player 1’s console will assume Player 2’s car will keep moving forward at a predictable speed. If Player 1’s console then learns Player 2 actually pressed the brakes suddenly, it will stop rendering the car as originally predicted, and suddenly Player 2’s car will appear in its “correct” position, with no interim video frames. The benefit of this is that if any player’s data/internet is interrupted, the game is able to proceed and no cars suddenly stop or disappear. If such tricks didn’t occur, real-time games (versus, say, card games) would be unplayable.

– Cloud Gaming: Why It Matters And The Games It Will Create
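The racing-game trick in the quoted passage, extrapolating a remote player’s state from the last packet received and snapping to the authoritative position when a fresher one arrives, is commonly called dead reckoning. A toy sketch, assuming 1-D motion and constant speed between updates (all names here are illustrative, not from any real engine):

```python
class RemoteCar:
    """Local stand-in for another player's car, fed by intermittent packets."""

    def __init__(self, position=0.0, speed=0.0):
        self.position = position
        self.speed = speed

    def predict(self, dt):
        """Between packets, assume the car keeps moving at its last known speed."""
        self.position += self.speed * dt

    def on_packet(self, position, speed):
        """An authoritative update arrived: snap to the server's truth, no interim frames."""
        self.position = position
        self.speed = speed


car = RemoteCar(position=0.0, speed=10.0)
car.predict(0.1)  # no packet yet: extrapolate forward
car.predict(0.1)  # still nothing: keep extrapolating
car.on_packet(position=1.5, speed=0.0)  # server says the player braked: snap back
```

The point of the sketch is the failure mode: if packets stop arriving, `predict` keeps the game playable, and the occasional visible "teleport" on `on_packet` is the price of that resilience.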

For all the techniques described here to present as consistent a game experience as possible, the packet-switched nature of the Internet is extremely important. Game creation would be much simpler if games were played over point-to-point connections. There’d be far fewer problems with consistency, close to none. The whole game would be much faster on less capable hardware.

But it would be impractical; it would be far too fragile. Making the Internet both packet-based and packet-switched was, in my opinion, the most consequential decision in its design. It made the network incredibly resilient [1]. This was necessary when early users logged in to remote systems over the internet from a terminal. Without packet switching, a momentary connectivity issue would have logged them off their system.

Today you see it in action when you switch from wifi to mobile data and back in the middle of a Zoom call without affecting the call. In a point-to-point system like the PSTN, you’d be dropped off the call, or out of the game, instantly.

Beyond video calls and gaming, e-commerce would be hugely unreliable, because online payments rely on hops between multiple domains, and without packet switching a momentary problem at any of those domains would cause the whole payment to fail. Downloading files would be unreliable as well – a problem at any point would cause the download to fail instantly. Ditto with uploading, such as publishing an article to your site or a video to YouTube.

Stepping back further still, as we appreciate this aspect of the Internet’s design, we should appreciate good design principles in general. And when we look at well-designed systems, we should also consider what they optimised for, as the Internet did for resilience.

[1] It’s also made it independent and decentralised. Both of which are critical to its continuing existence as a global human communication substrate.


Diversity on the Internet

WIRED studied 10,000 websites and found there’s been a steady reduction in the diversity of website design.

… the internet is a shared cultural artifact, and its distributed, decentralized nature is what makes it unique. As home pages and fully customizable platforms such as NeoPets and MySpace fade into memory, web design may lose much of its power as a form of creative expression. The Mozilla Foundation has argued that consolidation is bad for the “health” of the internet, and the aesthetics of the web could be seen as one element of its well-being.

And if sites are looking more similar because many people are using the same libraries, the large tech companies who maintain those libraries may be gaining a disproportionate power over the visual aesthetics of the internet. While publishing libraries that anyone can use is likely a net benefit for the web over keeping code secret, big tech companies’ design principles are not necessarily right for every site.

I think there is more opportunity for website diversity now than ever before.

Take, for example, the thousands of WordPress themes in the official gallery alone, leaving aside custom ones. Each theme can be customised in many ways at different levels: through the options WordPress offers, through those the theme itself offers and, as in the days of old, by editing the code itself. All of this is true of other content-management, site-builder and store-builder platforms too. It’s another issue that most people don’t bother with much customisation, and an even bigger one that the vast majority of people who write do so on social networks or on hosted writing platforms like Medium.

There’s more to it, though. The writers of the article are right when they say the internet is a shared cultural artefact. But the web is only a part of that. Further, the diversity they measure is visual, which is but one aspect of communication.

There is a tremendous explosion in creativity in communication in the form of email newsletters – not the web, but certainly the internet. Not as visually expressive as a website, but diverse nevertheless.

Then, for all its addictiveness, the app-only Instagram has allowed millions of people to express themselves visually, but through photographs. The web had its chance with Flickr, Photobucket and numerous other sites. 500px exists, but its few million users pale in comparison to Instagram’s billion plus.

We don’t think of podcasts when we think of the internet, but they are a huge part of it and a definite cultural artefact. Podcasts have exploded in creativity of topic, of format, of monetisation, even of podcast players themselves. Consider a podcast an aural instead of a visual blog or website, and the diversity is immediately obvious.

Once you expand your perspective of what the Internet means, what diversity means, you realise that even with the convergence in the design of websites, the Internet as a whole has exploded with diversity. It’s more vibrant now than ever before in its history.