The challenge of writing a book about promising emerging technologies is that they really were emerging during the book’s research phase. By the time the book arrives in print, some may have emerged successfully into widespread usage, while others end up struggling to cross the chasm, their promise of market execution never quite fructifying.

As an industry analyst sitting in the heart of Silicon Valley, I have been fortunate enough to sit through thousands of exciting demos and witness numerous product launches and life cycles of emerging technologies – some long and fruitful, some (sadly) very short. But then the game of emerging technologies is all about product development challenges, market acceptance risk, timing accuracy and, most frequently, the uncertainty of market execution. Investors backing these early technology plays choose to place multiple bets on future outcomes, even backing multiple horses – cashing out big on the ones that pan out while moving on from the vast majority of duds remaining in the portfolio.

Author David Rose picks one such emerging capability with a lot of early promise. He has grouped many related and diverse concepts and technologies that have the potential to work together, delivering tremendous benefits for consumers as well as industry, under the broad capability he calls “SuperSight”. The term is quite apt for what he goes on to describe in his fascinating new book of the same title, cutting a wide swath through the universe of possible use-cases and applications. To me, these technologies span an interesting array touching upon augmented and virtual reality, remote ubiquitous connectivity & computing, image analysis, artificial intelligence, speech and facial synthesis, edutech, autonomous industrial operations, agritech, intelligent city management, health-tech, essential services delivery, law & order, health & safety, not to mention the more obvious ones from entertainment & gaming, sports-tech, even design & architecture. And that’s just a few from our research universe.

He has taken pains to organize them into chunks of capabilities with the potential to deliver tangible benefits for humanity. The book is written as an easy, entertaining read while balancing the challenge of untangling the immense underlying complexity and risks. He describes them just enough for the lay reader to get the hang of it without getting lost in the details.

We are also taken on a brief tour of the author’s alma mater, the MIT Media Lab, truly the mother of many consumer innovations. Equally impressive are his own achievements in developing and commercializing truly leading technologies via his many startups and exits. We are treated to a flavor of the culture that pervades the Media Lab. The reader gets an inside track on why this innovation think tank is truly a magnet and home to many of the world’s leading technology thinkers and developers: how they live and breathe life into a world that’s still over-the-horizon for most people, and how they make very distinct schools of thought collaborate so effectively to design and speculate on futures that may still take decades to reach the mainstream.

David Rose also does an excellent job balancing the exciting possibilities of SuperSight with some of its potential pitfalls. I gravitated to a few of the downsides he warns about toward the final chapter. They got my attention because they are important for any lay user of such technologies not to take lightly. It also concerns me that some of those pitfalls may well fall into the cross-hairs of regulators and social activists well before their futures have had a chance to play out and be validated. We have seen by now how social media, an internet technology that promised to democratize and level the playing field of information access, ultimately reached a point where its algorithms came close to upending democracies around the world, created seemingly irreparable ideological polarities in societies, and left the lay social media user prey to unscrupulous political manipulation. I strongly recommend spending time pondering the thoughts shared in this chapter.

At this point, I leave the reader to savor the richness the author serves up in his book, experience the fun of his storytelling style, and observe how he paints an exciting picture of the future. To say more here would be to spoil that fun.

While the potential of SuperSight in the consumer market space is immense, the book barely scratches the surface of the potential economic value in the industrial markets that we track as industry analysts.

I am tempted to digress and pick a few select industrial SuperSight-related areas that we see as highly complementary to his ideas. We pick Intelligent Cities, Industrial Autonomy and Metaverses from our research universe, where we see much larger addressable markets. (Perhaps I might even coax David Rose to consider a future sequel to his book that examines the exciting opportunities the business-to-business markets open up.)

 

INDUSTRIAL AUTONOMY – SuperSight In Plant Operations

Here are a few technologies we examine closely in industrial markets.

Digital Twins act as a repository of digital information that mirrors a physical counterpart. A digital twin consists of three main parts – the physical product in real space, the virtual product in virtual space, and the connected data that ties the two together – and serves to remotely provide insight into the operational health of industrial equipment. Virtually any industrial or commercial entity can be twinned: an individual equipment asset, an entire fleet of equipment, a processing unit, an assembly line, a commercial building, even a human body. At the limit, the virtual worlds are indistinguishable from the real world, even to the machines and humans operating within them; simulated sensors inside digital twins – everything from lidars to radars to cameras – are modeled to obey the laws of physics in keeping with this definition of reality.
An interesting use-case is the building-information-model-based digital twin that allows one to drill into a built space and observe granular details. The generally static building information model, when connected to a real-time monitoring system, allows one to observe the current state of building subsystems, how they are impacted by occupant behavior, when assets like HVAC or lighting might fail, and much more. It is a model that evolves over time to deliver more value with each new stage of the asset’s lifecycle.
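The three-part pattern described above – physical asset, virtual counterpart, and the connected data tying them together – can be sketched minimally. The class, field, and asset names below are illustrative assumptions, not any vendor’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualAsset:
    """Virtual counterpart of a physical asset, e.g. an HVAC unit."""
    asset_id: str
    state: dict = field(default_factory=dict)

    def ingest(self, reading: dict) -> None:
        # The connected-data link: each telemetry message updates the twin.
        self.state.update(reading)

    def health_flag(self, key: str, limit: float) -> bool:
        # A trivial operational-health check; real twins run physics models.
        return self.state.get(key, 0.0) <= limit

# Usage: stream two readings from the (simulated) physical asset.
twin = VirtualAsset("hvac-07")
twin.ingest({"vibration_mm_s": 2.1, "temp_c": 41.0})
twin.ingest({"vibration_mm_s": 4.8})
print(twin.health_flag("vibration_mm_s", limit=4.5))  # vibration over limit -> False
```

The point of the sketch is the separation of concerns: the physical asset only emits readings, the twin only maintains mirrored state, and the link between them is just data.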

Remote analytics with Industrial IoT:   thyssenkrupp (the German elevator maker) is one example of a business model where service availability is the sole visible manifestation of heavy machinery performance. It’s a business where the ability to merge transparently into the lifestyle of modern city dwellers is the best assurance of excellence in product and user-interface design. As one of the world’s leading elevator manufacturers, thyssenkrupp Elevator maintains over 1.1 million elevators worldwide, including those at some of the world’s most iconic buildings, such as the 102-story One World Trade Center in New York (featuring the fastest elevators in the western hemisphere) and the Bayshore Hotel in Dalian, China. Over 150,000 of its elevators in the US are connected through its data hub MAX, which monitors and reports all of their actions, from the movement of the doors to error codes, at minimum every 12 hours.
Some 500,000 elevators and escalators are added in China alone each year. This kind of growth could put a lot of strain on the company’s ability to train enough technicians in these markets to meet demand. However, machine learning and remote diagnostics have helped augment employee talent while reducing the amount of time needed for training. With MAX, data collected in real time from connected thyssenkrupp elevators is sent to Microsoft’s Azure cloud platform, where an algorithm calculates the remaining lifetime of key systems and components in each elevator.
This changes the approach from a reactive path – remote monitoring that relies on a device to tell you an error has occurred – to a predictive path using IoT, which lets you reason about what’s happening both in real time and over a distribution of time. It lets you track, through constant improvement, the efficiency of each elevator and know when predictive or preemptive maintenance is needed. Using HoloLens, 24,000 elevator service technicians can now visualize and identify problems ahead of a job, and have remote, hands-free access to technical and expert information when onsite, improving the way people and cities move. All of this results in significant savings in time and stress: initial field trials have already shown that a service maintenance intervention can be done up to four times faster than before by using the device.
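A toy version of the remaining-lifetime idea described above, assuming a simple linear wear trend extrapolated to a failure threshold. This is an illustrative sketch only – thyssenkrupp’s actual MAX models are proprietary and certainly far more sophisticated:

```python
def remaining_cycles(wear_history, failure_level):
    """Estimate service intervals left before a component reaches failure_level.

    wear_history: wear measurements taken once per service interval.
    """
    n = len(wear_history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(wear_history) / n
    # Least-squares slope of wear vs. interval index.
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, wear_history)) \
            / sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return float("inf")  # no measurable degradation trend
    return (failure_level - wear_history[-1]) / slope

# Door-mechanism wear grows ~0.5 units per interval; failure at 10.0.
print(remaining_cycles([1.0, 1.5, 2.0, 2.5, 3.0], 10.0))  # -> 14.0 intervals left
```

The predictive shift is exactly this: instead of waiting for an error code, the trend itself tells the scheduler when to dispatch a technician.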
This operational efficiency and equipment optimization capability alone is expected to deliver a $1.4–$1.5 trillion annual impact.

Autonomous Plant Operations:   Optimization of a process plant (an oil refining unit, a power generation unit, a chemicals manufacturing operation, a transportation pipeline for volatile hydrocarbons) is a highly complex task where a large number of controllable parameters affect production in one way or another. Somewhere on the order of 100 different control parameters must be adjusted to find the best combination of all the variables. A machine learning-based prediction model provides a “production-rate landscape”, with its peaks and valleys representing high and low production. The control model relies on AI to handle the massive input of data from millions of sensors and onsite cameras at such plants. A multi-dimensional optimization algorithm then moves around in this landscape looking for the highest peak, representing the highest possible production rate. By moving through the production-rate landscape, the algorithm can give recommendations on how best to reach that peak, i.e. which control variables to adjust and by how much. These algorithm-driven control decisions provide SuperSight into the complex dynamics of chemical processes at a level of detail that would be impossible to monitor or observe at close range for safety reasons. Additionally, they can generate early warnings of deteriorating process conditions (even deadly situations), allowing plant engineers to avert danger without putting personnel or assets at risk.
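The search loop described above can be sketched as a simple hill climb over a surrogate production-rate model. The two control variables, the quadratic landscape, and its optimum are illustrative assumptions, not any vendor’s algorithm:

```python
import random

def production_rate(ctrl):
    """Surrogate 'production-rate landscape' learned from plant data (made up here)."""
    temp, flow = ctrl
    # Peak production at temp=350, flow=120 (arbitrary illustrative optimum).
    return 1000 - 0.02 * (temp - 350) ** 2 - 0.5 * (flow - 120) ** 2

def hill_climb(ctrl, step=1.0, iters=3000, seed=0):
    """Random hill climb: propose small control adjustments, keep only climbs."""
    rng = random.Random(seed)
    best = production_rate(ctrl)
    for _ in range(iters):
        cand = [c + rng.uniform(-step, step) for c in ctrl]
        rate = production_rate(cand)
        if rate > best:  # accept only moves that climb toward a peak
            ctrl, best = cand, rate
    return ctrl, best

ctrl, rate = hill_climb([300.0, 100.0])
print(round(ctrl[0]), round(ctrl[1]))  # converges near the 350/120 optimum
```

Real plant optimizers work in ~100 dimensions with constraints and safety envelopes, but the recommendation they emit is the same in spirit: which control variables to adjust, and by how much, to climb the landscape.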

INTELLIGENT CITIES  – SuperSight To Manage City Operations & Deliver Citizen Services
An evolving opportunity that has frequently stumbled on the path to its expansive vision – a classic case of over-promise and under-execution, often driven by factors beyond what the technologies themselves can deliver.
However, SuperSight-type capabilities have been used quite effectively in many interesting use-cases working to make cities intelligent.
City utilities have used AR and geo-mapping software to merge the road view with a map of the pipelines and electrical cabling buried deep underneath. You can use AR to “x-ray” the ground to see pipes, wiring, or other infrastructure that isn’t otherwise visible. Toms River Municipal Utilities Authority (TRMUA) in Toms River, New Jersey (population 92,000) maintains 435 miles of underground utility lines and 18 pump stations, and serves 47,000 ratepayers covering an area of 55 square miles. TRMUA used a platform created by Toronto-based Meemim, vGIS, to integrate ESRI’s ArcGIS, Microsoft Azure and HoloLens. This enhanced the user experience of field technicians by allowing them to see concealed utilities while remaining situationally aware of their actual surroundings.
The state of California leverages satellite imagery provider Planet Labs’ medium-resolution monitoring constellation, augmented by pinpoint, high-resolution satellites, to give firefighters the data they need to mitigate the threat posed by fire. The California Forest Observatory, a collaboration between Planet, Salo Sciences, and other partners, creates and distributes statewide maps of forest structure and available fuel. These maps serve two main purposes. By showing where forests are vulnerable, they help foresters plan efforts to remove excess fuel and make the landscape more resilient to fire. They also improve the models of fire propagation that firefighters use to plan the response to a fire in progress.
The Turkish Republic’s Ministry of Food, Agriculture & Livestock partnered with Airbus to create a reference information system accessible to all agriculture stakeholders within the country. The objective was to provide them with continuous yield forecasts at parcel scale and support them in optimizing water, pesticide and fertilizer applications. The spatial data is sent in near-real-time to ministerial technical units for conversion into solid vegetation maps (NDVI), calibrated with ground information collected live by more than 15,000 sensors spread across the entire country. This is shared across farms prior to harvest season.
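The vegetation maps mentioned above rest on the standard NDVI formula, (NIR − Red) / (NIR + Red), which ranges from −1 to 1, with healthy vegetation typically well above 0.3. The reflectance values in the example are illustrative:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red reflectance."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Example surface-reflectance values for one parcel pixel.
print(round(ndvi(nir=0.45, red=0.09), 2))  # -> 0.67, a dense healthy canopy
```

Computed per pixel across a satellite scene, this single ratio is what turns raw imagery into the parcel-scale yield and irrigation maps the ministry distributes.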

METAVERSES – SuperSight To Anticipate Outcomes In The Physical World

It’s rather fitting that this book came along as we were researching the market potential of “metaverses”. Overnight, a new buzzword was born, promising to be the more expansive “avatar” of augmented reality.

What Is The Metaverse?    Metaverses may be placed in a market category popularly termed “human co-experience”, where people leave their physical worlds and enter a parallel world unconstrained by time, space or resources, to collaborate, communicate and create assets that often have real value in the physical iteration.
Some, such as Tim Sweeney of Epic Games, take a narrower view, describing their highly successful Fortnite game experience as a metaverse because it’s a virtual 3-D space that mixes gaming and non-gaming elements.
Will metaverses be one open playground for creation? Or will they be walled gardens creating individual profit pools for the owning companies?
What are the legal implications?
Who owns the rights to a “digital twin” of a metaverse asset, such as your own home in the metaverse? One that may be bought and sold without any involvement on your part, and may even have appreciated in value well above that of your physical home.
Are you legally entitled to any of that appreciated value? Worse, what if the owner of that virtual replica of your home decides to claim ownership of your real home? How many digital replicas exist of your property?
NFT owners of your digital property may choose to squat on your ownership rights and force you to fork over sizeable sums to reclaim it.

How Does One Build A Metaverse?     The promise of a blended future where our physical and digital worlds collide has been teased over decades of mostly hyped-up virtual reality: bulky headsets and unfriendly software user experiences. At a time when people are demanding technologies that increase connectivity and presence, the present iterations of AR/VR have yet to live up to expectations.
One could imagine the future metaverse as a vast arena that brings together many technologies and markets. Clearly, building a fully immersive metaverse experience requires specialized hardware and software to mesh a lot of moving parts. It will demand the most advanced chips and computer hardware, videogame development, cloud computing, digital rendering and content creation.
Virtual reality and augmented reality are merely an entry point to an exotic mix of applications across NFTs, artificial intelligence, 5G networks, edge computing, and cloud computing. The metaverse will create vast new opportunities for brands, advertising and digital currencies. It’s probably better to think of it in terms of sub-metaverses that serve specific markets in gaming, enterprise and elsewhere. And the entry window to that opportunity will be opened up initially by smartphones, followed at some point by well-designed headsets at attractive price-points.

How Does One Create & Deliver New Value With The Metaverse?     There won’t be a single globally integrated metaverse. There will be multiple metaverses, each built around a group of related use-cases, with relevant partners bringing specific capabilities and strategic expectations. Depending on whom one talks to, metaverses will no doubt emerge as platforms envisioned and designed to further self-serving monetization strategies.
Facebook (now Meta Platforms) kicked off the buzz with its rebranding. Its point of view certainly leverages a strong social element, as well as augmented reality with its Oculus hardware brand. But it has also placed quiet bets on startups building new forms of monetizable AR/VR applications, from health workouts to games, acquiring Unit 2 Games (which makes the Crayta collaborative-game creation platform), Bigbox VR (developer of a popular game for Facebook’s Oculus VR goggles), and VR game-maker Downpour Interactive.

Nvidia, another company leading the charge, has its eyes on the software and hardware infrastructure necessary to enable these experiences. The company has placed itself at the center of building our blended future by engaging a massive developer ecosystem and addressing practical use cases for collaboration, digital twins and a platform for developing life-like avatars that can exist in a digital world. It keeps adding capabilities to its collaborative engineering and modeling platform, including the ability to build digital twins of cars, robots and other real-world projects to help speed development. Its flagship design platform Omniverse has already demonstrated capabilities and tools to help design and faithfully replicate real worlds inside its simulated universe. Nvidia also announced a Replicator service that will allow developers to seamlessly intertwine real-world and fully synthetic data, enabling them to simulate physically accurate 3D worlds. It recently grabbed headlines with the launch of Omniverse Avatar, a platform for generating interactive AI avatars, built by tying together a number of core Nvidia SDKs spanning AI, speech, computer vision, NLP, simulation and recommendation engines.
Other companies such as Verizon, Qualcomm and Unity Software are envisioning yet other opportunities around mobile connectivity and gaming/animation-centric use-cases.

How Does One Value The Metaverse Market Opportunity?       Don’t bother asking analysts for market size projections. They will be compelled to throw a number at you, and the number will invariably play out to be wrong. That clearly hasn’t stopped anybody from touting proprietary market-sizing numbers and republishing them for PR reasons.
Roblox CEO David Baszucki (an early metaverse experimenter in videogaming), meanwhile, projects a total addressable market of around $200 billion or more in each of the categories of mobile, streaming content and social media. Pitchbook (a financial data and research firm) estimates the combined 2022 market size of virtual and augmented reality at approximately $32 billion.

Here’s one other way to think about the broad macro valuation of metaverses. Take the baseline size of the entire US economy – the value of its entire aggregation of industry sectors. Now take out the industrial sectors that support the real infrastructure of physical movement, survival and comfort: transportation, housing, oil and gas, power, construction, food and beverages, and everything else you can think of. Be sure to count out spectator sports, consumer goods & apparel, fashion and hospitality as well.
What you are left with is software and internet technologies, communications, media and entertainment, and fintech.
Now go back to all the sectors we took out. Each of these sectors has elements that lend themselves to experiences in a virtual universe. Verizon recently demonstrated the potential of its volumetric video feeds from live sporting events. Oil & gas has very specialized simulation and visualization requirements for oil exploration data. Power, construction, and food and beverages also need to simulate the design of their facilities and new projects before they commit money and resources to build them. And of course there is the “virtual” advertising and promotion opportunity. This also opens doors to additional technologies such as lidar scanning, photogrammetry, 5-D modeling, etc.

Autonomous mobility probably has the most clearly visible opportunity: from design and simulation of car chassis, to predicting the behavior of electric batteries under various temperature conditions, to gauging the performance of EV power-trains, to evaluating their aerodynamics at various speeds and wind conditions. Terabytes of synthetic driving data can be generated by making virtual cars drive themselves on virtual roads in virtual cities under artificially created weather conditions – data that can then be used to train the neural-net algorithms behind the most responsive autonomous car and truck models. Let’s also imagine virtual try-outs of new fashion outfits and accessories on our exact body and facial characteristics, then strutting them down a virtual runway before virtual audiences to gauge instant reactions.

The industrial opportunity of metaverses is best described by Jensen Huang of Nvidia, one of the frontrunners in tooling their creation. Metaverses are “a vision of connected, virtual reality universes that go beyond facilitating recreation and commerce.” Businesses can lean on the metaverse (or omniverse, Nvidia’s preferred term) to reduce wastefulness and increase operational efficiency: “By doing that, we could decrease the amount of waste, and that’s the reason why the economics are so good for companies. They’re willing to invest a small amount of money to buy into this artificial intelligence capability, but what they save is hopefully hundreds and hundreds of billions of dollars.” Undoubtedly, the company’s chips will play crucial roles in the computing devices that will run the metaverse.

 

 

SUPER SIGHT – The Promise & Perils Of Insight From A New Reality