
The integration of cognitive digital twin technology with the Internet of Things (IoT) has the potential to revolutionize the marketplace by providing companies with valuable insights into their products and processes.

What is Cognitive Digital Twin Technology?

Cognitive digital twin (CDT) technology is a virtual model of a physical system that uses data and artificial intelligence (AI) to simulate and predict the behavior of that system. It combines data from sensors and other sources with machine learning algorithms to create a digital representation of a physical system.

A cognitive digital twin model can be used to monitor and analyze the behavior of a system in real-time, and it can be used to simulate the behavior of that system under different conditions. By using this technology, companies can gain insights into the performance of their products, optimize their operations, and reduce maintenance costs.
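To make that concrete, here is a minimal sketch, in Python, of what a highly simplified digital twin loop might look like. The pump, its sensor feed, and the toy thermal model are all illustrative assumptions, not any particular product's implementation:

```python
# A toy digital twin: mirrors a pump's temperature from sensor readings
# and simulates a "what if" scenario. All names and constants here are
# hypothetical and chosen only for illustration.

class PumpTwin:
    """Virtual stand-in for a physical pump."""

    def __init__(self, ambient_temp_c: float = 20.0):
        self.temp_c = ambient_temp_c      # current mirrored state
        self.history = []                 # readings seen so far

    def ingest(self, sensor_temp_c: float) -> None:
        """Update the twin's state from a real sensor reading."""
        self.temp_c = sensor_temp_c
        self.history.append(sensor_temp_c)

    def simulate(self, load_factor: float, steps: int = 10) -> list:
        """Project temperature forward under a hypothetical load, using a
        toy model: each step adds heat proportional to load and sheds a
        fraction of the excess over ambient (20 C)."""
        temps, t = [], self.temp_c
        for _ in range(steps):
            t += 1.5 * load_factor - 0.1 * (t - 20.0)
            temps.append(round(t, 2))
        return temps

twin = PumpTwin()
for reading in (21.0, 24.5, 28.0):        # stand-in for a live sensor feed
    twin.ingest(reading)
print(twin.simulate(load_factor=2.0, steps=5))  # "what if we double the load?"
```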

What is the Internet of Things (IoT)?

The Internet of Things (IoT) is a network of physical devices, vehicles, home appliances, and other items that are embedded with sensors, software, and other technologies that enable them to connect and exchange data with other devices and systems over the Internet.

IoT devices can collect data from their environment, such as temperature, humidity, and pressure, and transmit that data to other devices or systems for analysis. By using IoT devices, companies can monitor their products and processes in real-time and gain insights into how they are performing.
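As a concrete example, IoT devices commonly transmit readings over a lightweight publish/subscribe protocol such as MQTT. The minimal sketch below assumes the open-source paho-mqtt client library; the broker hostname, topic, and device ID are made-up placeholders:

```python
# Illustrative IoT publish step using the paho-mqtt client library.
import json
import time

import paho.mqtt.client as mqtt

# On paho-mqtt >= 2.0, pass mqtt.CallbackAPIVersion.VERSION2 as the
# first argument to Client().
client = mqtt.Client()
client.connect("broker.example.com", 1883)   # hypothetical MQTT broker

reading = {
    "device_id": "sensor-042",               # hypothetical device
    "timestamp": time.time(),
    "temperature_c": 22.4,
    "humidity_pct": 41.0,
    "pressure_hpa": 1013.2,
}
# Publish the reading for downstream systems (e.g., a digital twin) to consume.
client.publish("factory/line1/environment", json.dumps(reading))
client.disconnect()
```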

The Impact of Integrating Cognitive Digital Twin Technology With IoT

Cognitive digital twin technology can be integrated with IoT by using data from IoT devices to create a digital twin model of a physical system. IoT devices can provide data about the performance of a product or process, which can be used to create a digital twin model.

The digital twin model can then be used to simulate the behavior of the physical system under different conditions and to predict how the system will behave in the future. By using IoT data to create a digital twin model, companies can gain insights into the performance of their products and processes, and they can optimize their operations to reduce costs and improve efficiency.
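A minimal sketch of that predictive step, assuming only NumPy and an invented history of readings from an IoT device, is below: fit a simple trend to the data the devices have already reported, then extrapolate it forward. A production twin would use a far richer model, but the data flow is the same:

```python
# Toy "predict future behavior" step: fit a linear trend to sensor
# history collected from IoT devices, then extrapolate. Values invented.
import numpy as np

hours = np.array([0, 1, 2, 3, 4, 5], dtype=float)       # time of each reading
vibration = np.array([0.9, 1.0, 1.2, 1.3, 1.5, 1.7])    # mm/s RMS

slope, intercept = np.polyfit(hours, vibration, deg=1)

future_hours = np.array([6.0, 12.0, 24.0])
forecast = slope * future_hours + intercept
for h, v in zip(future_hours, forecast):
    print(f"t+{h:4.0f}h: predicted vibration {v:.2f} mm/s")
```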

There are several benefits to integrating cognitive digital twin technology with IoT, including:

  1. Predictive Maintenance: By using a cognitive digital twin model, companies can predict when maintenance is required on their products or processes, reducing downtime and maintenance costs (a minimal sketch of this pattern follows this list).
  2. Improved Efficiency: By monitoring the performance of their products and processes in real-time, companies can optimize their operations to improve efficiency and reduce costs.
  3. Reduced Waste: With CDT, companies can reduce waste by identifying areas where resources are being wasted.
  4. Enhanced Product Design: By using a cognitive digital twin model, companies can simulate the behavior of their products under different conditions and make design changes in the earlier stages of R&D to improve performance, reduce costs, and cut time from POC to market.
  5. Improved Customer Experience: By monitoring the performance of their products in real-time, companies can improve the customer experience by identifying and addressing issues before they become major problems.
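On the predictive maintenance point, one common pattern is to train an anomaly detector on sensor readings from normal operation and flag drift before it becomes a failure. The sketch below uses scikit-learn's IsolationForest on synthetic vibration and temperature data; it illustrates the general pattern, not any particular vendor's implementation:

```python
# Predictive maintenance sketch: flag anomalous sensor readings with an
# IsolationForest trained on (synthetic) normal operating data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Invented training data: [vibration mm/s, temperature C] under normal load.
normal = rng.normal(loc=[1.0, 60.0], scale=[0.1, 2.0], size=(500, 2))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal)

# New readings: the last one drifts toward a failure signature.
new_readings = np.array([[1.05, 61.0],
                         [0.95, 59.5],
                         [2.40, 78.0]])
flags = detector.predict(new_readings)   # +1 = normal, -1 = anomaly
for reading, flag in zip(new_readings, flags):
    status = "schedule maintenance" if flag == -1 else "ok"
    print(reading, "->", status)
```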

How the Market is Already Benefiting from Digital Twin and IoT Technologies

Many industries are already benefiting from this kind of integration between CDT and IoT technologies. Chief among these is the transportation industry.

Cognitive digital twin technologies coupled with IoT are already proving invaluable for predictive maintenance of high-value military vehicles, airplanes, ships, and even passenger cars. For example, digital twin solutions like those developed by CarTwin extend the lifespan of cars and other vehicles by monitoring the vehicle’s “health” through its “digital twin.”

Basically, CarTwin can provide diagnostic and predictive models for all vehicle systems for which data is available (either directly or indirectly) onboard the vehicle.

Virtually any part of the vehicle that has sensors, or for which sensors can be developed, can be “twinned.” These data sets are then enhanced and augmented with design and manufacturing data already made available by the OEM.

Designed primarily for use in vehicle fleets, CarTwin combines its digital twin with powerful AI models to predict breakdowns, monitor and improve performance, and measure and record real-time greenhouse gas emissions, reducing expensive maintenance costs and avoiding the lost revenue associated with fleet downtime.

You can read much more about AI and digital twin technology in my new book Quantum Care: A Deep Dive into AI for Health Delivery and Research. While the book’s primary focus is on healthcare delivery, it also takes a deep dive into digital twin tech, with an entire chapter devoted to CDT, as well as coverage of IoT and the development and launch of CarTwin!

Rohit Mahajan is a Managing Partner at BigRio and the President and Co-Founder of Citadel Discovery. He has particular expertise in the development and design of innovative AI and machine learning solutions for clients in Healthcare, Financial Services, Retail, Automotive, Manufacturing, and other industry segments.

CarTwin has leveraged AI and Digital Twin technologies to create a digital, cloud-based clone of a physical vehicle designed to detect, prevent, predict, and optimize through AI and real-time analytics. If you would like to benefit from our expertise in these areas or if you have further questions on the content of this article, please do not hesitate to contact us.

Search engine technology startup You.com has announced the launch of YouChat 2.0, a groundbreaking new “multimodal conversational AI” system that the company says will take the internet search experience to “a whole new level,” leaving search engine giant Google wondering what happened!

The company says that YouChat 2.0 is the first web search that combines advanced conversational AI with community-built apps, offering a unique and interactive experience with each query. Its blended large language model, known as C-A-L (Chat, Apps, and Links), serves up charts, images, videos, tables, graphs, text, or code embedded in its responses to user queries. The idea is fewer open tabs and less drifting away from your search engine.

“We already have 150+ apps in our main search results page, but we’re now pulling more into the chat world and into this chat interface, which is non-trivial because you basically have to allow the chat model to decide when is best to answer a question by just showing you the facts,” said You.com co-founder and CEO Richard Socher in an interview with VentureBeat. “No one has done this yet — we’re the first to launch it to the world publicly.”

Socher knows a thing or two about groundbreaking advances in natural language processing (NLP), the technology that underpins all popular search engines. According to Google Scholar, Socher is currently the fourth most cited researcher in the field. As former chief scientist (and EVP) at Salesforce and former adjunct professor at Stanford’s computer science department, Socher has built his career on novel NLP applications. He now thinks it’s time to reimagine the way we interact with traditional search engines.

“Google has done incredible research and propelled the field forward in many dimensions in terms of its research, but they are making $150 billion a year by invading your privacy and showing you ads on a search results page,” Socher said, appearing visibly frustrated with the current state of affairs. “So when you develop technology, that would work great if, instead of six ads, you had one ad, but [if] it means you lose $500 million a day, you’re probably not incentivized to launch that new technology into the world.”

“It’s the classic innovator’s dilemma,” he added. “Google is very good at running things not profitably for a long time, like Maps and YouTube, and then pulling ads into it. The problem is this particular technology changes their core product.”

The company invites developers to submit apps to its open platform and work together to create the ultimate chat-search-do engine. Developers can learn more at You.com’s developers page.

How BigRio Helps Bring Advanced AI Solutions to the Internet Experience

The announcement of YouChat 2.0 by You.com is yet another example of how innovative startups are advancing the ubiquity of AI and NLP. AI algorithms are already “running behind the scenes” of most of our internet and social media experiences, and soon there will be little or no separation between Internet of Things applications, chat, and AI, with each very much reliant on and integrated with the others.

BigRio prides itself on being a facilitator and incubator for such advances in leveraging AI to improve the digital world.

In fact, we like to think of ourselves as a “Shark Tank for AI.”

If you are familiar with the TV series, then you know that, basically, what they do is hyper-accelerate the most important part of the incubation process – visibility. You can’t get better visibility than getting out in front of celebrity investors and a TV audience of millions of viewers. Many entrepreneurs who have appeared on that program – even those who did not get picked up by the Sharks – succeeded because others who were interested in their concepts saw them on the show.

At BigRio, we may not have a TV audience, but we can do the same. We have the expertise not only to weed out the companies that are not ready for the market, as the sharks on the TV show do, but also to mentor those we feel are ready and get them noticed by the right people in the AI investment community.

You can read much more about how AI is redefining the Internet of Things in my new book Quantum Care: A Deep Dive into AI for Health Delivery and Research. While the book’s primary focus is on healthcare delivery, it also takes a deep dive into AI in general, with specific chapters on IoT and NLP technologies.

Rohit Mahajan is a Managing Partner with BigRio. He has particular expertise in the development and design of innovative solutions for clients in Healthcare, Financial Services, Retail, Automotive, Manufacturing, and other industry segments.

BigRio is a technology consulting firm empowering data to drive innovation and advanced AI. We specialize in cutting-edge Big Data, Machine Learning, and Custom Software strategy, analysis, architecture, and implementation solutions. If you would like to benefit from our expertise in these areas or if you have further questions on the content of this article, please do not hesitate to contact us.

When I was in graduate school, I designed a construction site of the future. It was in collaboration with Texas Instruments in the late 90s. The big innovation at the time was RFID (radio-frequency identification). Not that RFID was new. In fact, it has been around since World War II, when it was used to identify Allied planes. After the war, it made its way into industry through anti-theft applications. In the 80s, a group of scientists from Los Alamos National Laboratory formed a company using RFID for toll payment systems (still in use today). A separate group of scientists there also created a system for tracking medication management in livestock. From there it made its way into multiple other applications and began to proliferate.

RFID got a boost in 1999 when two MIT professors, David Brock and Sanjay Sarma, reversed the trend of adding more memory and more functionality to the tags and stripped them down to a low-cost, very simple microchip. The data gleaned from the chip was stored in a database and was accessible via the web. This was right at the time that the wireless web emerged (good old CDPD) as well, which really bolstered widespread adoption. This also precipitated funding from large companies, like Procter & Gamble and Gillette (this was before P&G acquired Gillette), to institute the Auto-ID Center at MIT, which furthered the creation of standards and cemented RFID as an invaluable weapon for companies, especially those with complex supply chains.

OK, as you can tell, RFID has a special place in my heart. I even patented the idea of marrying RFID with images, but that is another story. Anyway, up to this point you’ve probably decided this is a post about RFID, but it’s not. It’s a post about RFID to IoT (Internet of Things). The term Internet of Things (IoT) was first coined by British entrepreneur Kevin Ashton in 1999 while working at Auto-ID Labs, specifically referring to a global network of objects connected by RFID. But RFID is just one type of sensor and there are numerous sensors out there. I like this definition from Wikipedia:

In the broadest definition, a sensor is an electronic component, module, or subsystem whose purpose is to detect events or changes in its environment and send the information to other electronics, frequently a computer processor. A sensor is always used with other electronics, whether as simple as a light or as complex as a computer.

Sensors have been around for quite some time in various forms. The first thermostat came to market in 1883, and many consider this the first modern, manmade sensor. Infrared sensors have been around since the late 1940s, even though they’ve only recently entered the popular nomenclature. Motion detectors have been in use for a number of years as well: building on the radio-wave reflection principles Heinrich Hertz demonstrated in the late 1800s, they were advanced in World War II in the form of radar technology. There are numerous other sensors: biotech, chemical, natural (e.g., heat and pressure), sonar, infrared, microwave, and silicon sensors, to name a few.

According to Gartner, there are currently 8 billion IoT units worldwide, and there will be 20 billion by 2020. Suffice it to say, there are numerous sources of data for tracking “things” within an organization and throughout supply chains. There are also numerous complexities to managing all of these sensors, the data they generate, and the actionable intelligence that is extracted and needs to be acted on. Some major obstacles are networks with time delays, switching topologies, density of units in a bounded region, and metadata management (especially across trading partners and customers). These are all challenges we at BigR.io have helped customers work through and resolve. A great example is our Predictive Maintenance offering.

Let’s get back to RFID to IoT. There is a tight coupling because the IP address of the unit needs to be supplemented with other information about the thing (for example, condition, context, location, and security). RFID and other sensors working in unison can provide this supplemental information. This marriage enables advanced analytics, including the ability to make predictions. Large sensor networks must be properly architected to enable effective sensor fusion. Machine Learning helps take IoT to the next level of sophistication for predictions and automated fixes, and it can help figure out when and where every “thing” fits in the ecosystem in which it operates. A proper IoT agent should monitor the health of each system individually and in relation to the other parts. Consensus filters help with convergence analysis, noise-propagation reduction, and the ability to track fast signals.
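For readers who want the flavor of that last point, here is a minimal average-consensus sketch: each node repeatedly nudges its estimate toward its neighbors' estimates, so noisy per-sensor readings converge to a network-wide value. The four-node ring topology and step size are illustrative assumptions:

```python
# Minimal average-consensus sketch over a small sensor network.
# Each node nudges its estimate toward its neighbors' estimates; noisy
# local readings converge to a shared value. Topology is invented.
import numpy as np

# Adjacency matrix of a 4-node ring network (1 = neighbors).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

x = np.array([19.2, 21.5, 20.1, 23.0])   # noisy local temperature estimates
epsilon = 0.2                             # step size; keep below 1/max_degree

for _ in range(50):
    # x_i += eps * sum_j A_ij * (x_j - x_i)  -- the classic consensus update
    x = x + epsilon * (A @ x - A.sum(axis=1) * x)

print(x)          # all entries approach the average of the initial readings
print(x.mean())
```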

There are other factors that play into why IoT is so hot right now: the whole Big Data phenomenon has fueled its growth, nearly endless compute power has provided a foundation on which advanced IoT applications can run, and Machine Learning libraries have been democratized by companies like Google, Facebook, and Microsoft. In general, Machine Learning thrives when mounds of data are available. However, storing all data is cost-prohibitive, and so much data is being generated that most companies opt to store only bits of critical data; some store only the data surrounding failures, to freeze those moments for later analysis. You may not want to store all data, but you don’t want to lose the “metadata,” or the key information that the data is trying to tell you, whether from the sensor itself or indirectly through neighboring sensors. During a stint supporting Federal- and Defense-related sensor fusion initiatives, I picked up a handy classification of data:

  • Data
  • Information
  • Knowledge
  • Intelligence

The flow moves the data being generated down the line: data → information → knowledge → intelligence that can be acted upon.

There are also the ABCs of Data Context:

[A]pplication Context: Describes how raw bits are interpreted for use.

[B]ehavioral Context: Information about how data was created and used by real people or systems.

[C]hange Over Time: The version history of the other two forms of data context.
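As a rough illustration of how these three forms of context might travel with a single reading, consider a simple record type; the fields below are assumptions invented for this sketch, not any standard:

```python
# Illustrative record attaching the ABCs of Data Context to one reading.
from dataclasses import dataclass, field

@dataclass
class ContextualReading:
    raw_value: float                      # the raw bits from the sensor
    # [A]pplication Context: how the raw bits are interpreted for use.
    unit: str = "celsius"
    calibration_id: str = "cal-2023-07"   # hypothetical calibration record
    # [B]ehavioral Context: how the data was created and who uses it.
    produced_by: str = "sensor-042"       # hypothetical device
    consumed_by: list = field(default_factory=lambda: ["twin-service"])
    # [C]hange Over Time: version history of the other two contexts.
    context_version: int = 3
    previous_versions: list = field(default_factory=lambda: [1, 2])

reading = ContextualReading(raw_value=22.4)
print(reading)
```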

Data context plays a major role in harnessing the power of an IoT network. As we progress to smarter networks, more sophisticated sensors, and artificial intelligence that manages our “things,” the architecture of your infrastructure (enterprise data hub), the cultivation and management of your data flows, and the analytics automation that rides on top of everything become critical for day-to-day operations. The good news is that if this is all done properly, you will reap the rewards of thing harmony (coined here first, folks).

Please visit our Deep Learning Neural Networks for IoT white paper for a more technical slant.