2022: A Look Ahead




As we close out 2021 and ring in what we hope will be a bright and fulfilling year, it's time to consider the trends likely to shape the months ahead. We live in a world undergoing major transformations and exponential trends, and we are likely to see significant developments in the new year.

So what might those changes be? Here are a few of my predictions:

COVID slides into the background 

Just as we were expecting the pandemic to fade away and become endemic, the Omicron variant surprised us yet again with a large number of mutations, increased transmissibility and an ability to land the unvaccinated in hospitals. The fact that it hit right around the holiday season, causing thousands of flight cancellations and millions of upended plans, made its psychological impact even worse. But, on the positive side, this too shall pass. Successive variants will likely become less deadly, and the virus will eventually go the way of every other pandemic. Perhaps Omicron itself is the last major variant. Time will tell, but we will likely see the end of COVID as an economy-stopping phenomenon by the end of 2022.

A changed world: Telehealth, remote work, reduced social contact

While COVID will slip into the background, life will not return to the way it was in 2019. In the more than two years since this fiasco began, it has caused such fundamental shifts in so many aspects of life that expecting them to dissolve and dissipate, as if Thanos had just snapped his fingers, is unrealistic. For example, companies are already cutting back on large real estate leases, realizing that the entire workforce is unlikely to return to work in person. JPMorgan Chase's CEO, Jamie Dimon, plans to retain only 60 desks per hundred employees. And according to Fortune, 74% of Fortune 500 CEOs expect a reduction in their real estate footprint.

The pandemic also made us comfortable with remote consultations with our health professionals. In fact, Martin Schreiber, MD, chief medical officer for DaVita Kidney Care's home modalities, says that the share of patients preferring telehealth appointments has increased from less than 40% pre-pandemic to more than 60% now. One can imagine combining a broad range of in-home tests with online consultations to help diagnose an array of conditions. The missing piece in this equation so far is in-home testing; I expect it to be an area of significant growth. Specialized devices combined with AI-based smartphone apps that can detect patterns should be an exciting area to watch.

Fewer visits to the office also mean reduced social contact in general. The office is, among other things, programmed socialization. Time spent with the nuclear family will increase, but for many, the number of new people they meet in real life will diminish greatly. Arthur C. Brooks, writing for The Atlantic, reports that nearly two-thirds of the working population feels working from home is worse than working at the office, and about 70% say that combining home and work is actually a source of stress. Many people do not have ideal environments at home, and feeling that they will seldom have an escape can be psychologically distressing, leading to resignations and poor mental health. A less friendly world, even more challenged by mental health crises, appears to be on the cards.

Call it what you want, the Metaverse is on its way 


As people at home with reduced social contact are driven to interact through digital means, time spent online will remain at high levels and increase over the long term. According to DataReportal, the average global internet user now spends six and a half hours online every day.

While I don't believe we will be walking around all day with VR or AR headsets any time soon, we will, one application at a time, dip our toes into the Metaverse. Some might argue that the idea of the Metaverse really didn't need a new name at all, and they have a point. But there is no point in fighting a name that is likely to stick. The “Metaverse” is how we will refer to higher-fidelity, connected digital experiences within which rich social and commercial interactions are possible with both AI and humans. Another way to look at the Metaverse is as an extension of existing trends: improvements in graphics realism and gaming interactivity, the virality of social networks, the spread of AI and visualization technology, high-speed networks, and vastly improved digital financial technologies. When the component pieces are available, someone is likely to put them together. Voila! Metaverse.

Will we spend more time in the Metaverse? Yes. Despite the fact that we love to hate social networks, we vote with our attention and presently spend nearly two and a half hours every day on these sites. If our digital experiences and interactions become even richer as things move to the “Metaverse”, will this number go up or down? Quite likely, up.

Crypto is the new software infrastructure 

Bitcoin pioneered the idea of a blockchain: a decentralized, unalterable, cryptographically secured store of data that includes data provenance and change histories. Think of it as a ledger no one owns, which is hard to hack but slow to transact with. If the Bitcoin blockchain was a specialized, decentralized database, Ethereum built on these ideas to implement a "virtual machine": a virtual computer that runs on a decentralized network. Imagine a computer cobbled together from the resources of all the individual machines that connect with each other to run the peer-to-peer Ethereum protocol. With programs (called “smart contracts”) running atop this virtual machine, one can see the early beginnings of a "world computer" emerge, whose resources would not be owned by a trillion-dollar corporation but by the millions of users who profit collectively from its use. How would this change the dominance of cloud vendors? How could it prevent the exploitation of individual and small-business data? Overall, I think decentralized infrastructure for compute and storage is an important and growing trend, and one which can restore some balance to a tech ecosystem where three or four giants represent such a huge percentage of the entire sector's market capitalization.
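To make the "unalterable ledger" idea concrete, here is a minimal, purely illustrative Python sketch. It models only how each block commits to the hash of its predecessor, so history is tamper-evident; it deliberately leaves out consensus, networking, signatures and everything else a real blockchain needs, and the class and function names are hypothetical.

```python
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


class Ledger:
    """Toy append-only ledger: each block stores its predecessor's hash."""

    def __init__(self):
        genesis = {"index": 0, "timestamp": time.time(),
                   "data": "genesis", "prev_hash": "0" * 64}
        self.chain = [genesis]

    def append(self, data) -> dict:
        block = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "data": data,
            "prev_hash": block_hash(self.chain[-1]),
        }
        self.chain.append(block)
        return block

    def is_valid(self) -> bool:
        """Any edit to an earlier block breaks every later prev_hash link."""
        return all(
            self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )


ledger = Ledger()
ledger.append({"from": "alice", "to": "bob", "amount": 5})
ledger.append({"from": "bob", "to": "carol", "amount": 2})
print(ledger.is_valid())                 # True
ledger.chain[1]["data"]["amount"] = 500  # tamper with an earlier entry
print(ledger.is_valid())                 # False: the change is detectable
```

Replicating that chain across many independent machines, none of which owns it, is what turns this simple data structure into the "ledger no one owns" described above.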

The innovation stemming from the crypto ecosystem is not limited to the blockchain and a P2P "world computer" that can run code. There is much more: Layer 2 networks such as side-chains that speed up slow base-layer blockchains, new Layer 1 blockchains that are inherently faster, and non-fungible tokens (NFTs) that can represent an individual instance of an asset, such as a painting or a plot of land (unlike currency, where one $1 note is equivalent to another). There is also the idea of automated governance implemented via algorithms instead of people and management teams: Decentralized Autonomous Organizations (DAOs).
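The fungible/non-fungible distinction comes down to what the ledger tracks: interchangeable balances versus ownership of specific, named assets. The toy Python sketch below illustrates that bookkeeping difference; the dictionaries and function names are hypothetical and do not correspond to any real token standard such as ERC-20 or ERC-721.

```python
# Fungible token: only balances matter; any 10 units are interchangeable.
fungible_balances = {"alice": 100, "bob": 25}

def transfer(balances, sender, receiver, amount):
    assert balances.get(sender, 0) >= amount, "insufficient balance"
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount

# Non-fungible token: each token id is a distinct asset with its own owner.
nft_owners = {
    "painting-001": "alice",  # a specific artwork
    "parcel-0042": "bob",     # a specific plot of land
}

def transfer_nft(owners, token_id, sender, receiver):
    assert owners.get(token_id) == sender, "sender does not own this token"
    owners[token_id] = receiver

transfer(fungible_balances, "alice", "bob", 10)            # any 10 units will do
transfer_nft(nft_owners, "painting-001", "alice", "bob")   # this painting, specifically
```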

On top of all these innovations, of course, is the currency, or token, itself. By giving users a token, you can incentivize those who hold a shared belief in what you are collectively building. The incentive accrues from a belief in future value, not from the expenditure of mountains of capital. Incentives exist in every industry and are used to accelerate growth. All of us have seen "buy one, get one free" signs, been offered $50 in free stock for signing up at a brokerage, or been enticed with a free ticket. But these incentives generally require spending hard currency and can only be provided by those who already hold a large amount of capital. Entrepreneurs who are just starting out, startups and smaller companies may not be able to match this style of land grab via "dumping", and are hence at a disadvantage even with a superior product.

However, if users believe in the superiority of the product being built by these entrepreneurs and accept a "token" that enables economic exchange within the product or ecosystem, they may engage with the product and help it grow. They would be incentivized by the knowledge that, as usage increases, the value of the token they hold will also increase. Today, the user of a search engine or social network gets no such benefit. If you were an early adopter and helped build a monopoly, then, in the immortal words of Douglas Adams, “So long, and thanks for all the fish.”

Developers are moving to crypto ecosystems in droves, and this will transform the future of application development, and indeed, the tech industry. This is a space rich with new ideas and many smart builders who will ensure that their work meaningfully impacts the future. In 2022 and beyond, a growing number of applications will be built on "crypto rails".

No-code, human-AI collaboration and automated idea generation continue to improve

Personal computers first came out in the late '70s and achieved great popularity by the '80s. Early systems such as the Commodore 64 or TRS-80 would boot straight into a BASIC programming environment as soon as they were turned on. Kids, parents, and indeed all users were expected to be programmers as much as they were end users. Pre-packaged commercial software was not as abundant as it is today, and computer magazines would often include listings of BASIC programs that had to be painstakingly keyed into the interpreter before they would run. School curricula were designed around the assumption that using computers meant programming in a high-level language like BASIC. BBC Micros and Acorn RISC machines were distributed to schools across the UK. Apple IIs and, later, IBM PCs were bought by schools in the US.
