CEO and co-founder of DataVisor, a leading fraud detection company with solutions powered by transformational AI technology.
Today’s predominantly cloud-based information technology architecture paradigm was built on the premise that devices are “dumb” and that servers are “smart”—and that, therefore, the latter should do all the heavy lifting. The conditions that gave rise to such design are shifting, prompting us to revisit the status quo across business disciplines, especially in fraud and risk management.
How Did We Get To The Cloud-Centric World?
The widespread adoption of personal computers meant a default to a predominantly decentralized architecture where end users owned and maintained the machines that stored and processed data. PCs were “smart” end devices interconnected by a relatively weak infrastructure and, thus, performed all of the thinking. Soon enough, though, phone lines gave way to broadband connections and fiber optics.
Enter the cloud revolution. Most digital services today rely on third-party computation centers (we may call it “the cloud,” but it’s really just servers owned by someone else) that handle all the heavy lifting, and our personal devices mainly act as a means of communication with those centers.
Shortcomings Of The Cloud Paradigm
The cloud democratized access to computationally intensive services, which proved beneficial in many regards. The field of artificial intelligence, for one, benefits immensely from the global pooling of activity data for analytics and machine learning.
However, the cloud paradigm also has certain limitations that invite us to explore a new form of decentralized architecture:
1. Companies everywhere face stringent privacy requirements in connection with the transfer, processing and storage of users’ data. Solutions that store and process sensitive data locally limit exposure to adverse regulatory and reputational consequences.
2. The volume and richness of data generated by modern devices have far outpaced the evolution of mobile internet infrastructure. As a result, users of cloud services experience delays caused by the time it takes for data to travel from devices to servers and back. This delay is known as latency, a major area of concern for many services.
The Case For Edge Computing
To overcome these challenges, innovators can tap into the power of device computing (edge computing). If smarter devices can process more data, why not use them to minimize exposure to privacy issues, communication costs and latency?
Here are two examples that illustrate the impact of this shift:
1. Device Computing In Self-Driving Cars
These incredibly advanced machines depend on an overwhelming amount of data generated by light, radar, sound, temperature and speed sensors. Life-and-death decisions rest on this information and must be made in milliseconds.
Even the most advanced mobile networks can’t guarantee that all this data will always be transmitted to the cloud and back in time for the wheels to turn or the brakes to be applied to prevent a collision. Instead, driverless cars are equipped with advanced computer devices that perform these calculations on the spot.
2. Device Computing In Fraud Prevention
The devices we use to interact with e-commerce and financial services firms get smarter by the minute. Fraud is on the rise across digital channels, but fortunately, smarter devices generate (and can process) rich information that companies can use to detect attacks.
This includes device information (e.g., about the hardware, operating system, network and language settings) and info about users’ behavior (e.g., time spent in forms and action sequences).
In cloud-only systems, information is often sacrificed to minimize latency. For example, before authorizing a transaction, online payment processors are hesitant to introduce several seconds’ delay to gather device and behavioral data, send it to the cloud and wait for a response that might never arrive over a bad network connection.
To avoid this trade-off, avant-garde companies are tapping the cloud’s power to derive intelligent decision logic through big data analysis and machine learning, then “pushing” that logic to devices to detect and stop fraudulent transactions at the edge.
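As a minimal sketch of this pattern (all names and thresholds here are hypothetical, not any vendor’s actual product), cloud-derived decision logic can be as simple as a set of weighted rules that the device evaluates entirely on its own, so a transaction can be approved or blocked without a network round trip:

```python
# Hypothetical sketch: decision logic derived in the cloud and
# periodically pushed to the device as (signal, threshold, weight) rules.
PUSHED_RULES = [
    ("form_filled_under_one_second", 1, 0.5),  # suspiciously fast form entry
    ("distinct_cards_this_session", 3, 0.4),   # card cycling behavior
    ("locale_mismatch", 1, 0.3),               # device vs. account language
]
BLOCK_THRESHOLD = 0.6

def score_locally(signals: dict) -> float:
    """Sum the weights of every pushed rule the device-side signals trip."""
    score = 0.0
    for signal, threshold, weight in PUSHED_RULES:
        if signals.get(signal, 0) >= threshold:
            score += weight
    return score

def allow_transaction(signals: dict) -> bool:
    # Decided on the device itself -- no cloud round trip before authorizing.
    return score_locally(signals) < BLOCK_THRESHOLD
```

The cloud still does the heavy lifting of learning the rules and weights from global data; the device only evaluates them, which is cheap enough to run in milliseconds.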
Device computing is a particular game-changer in reducing server workloads during malicious attacks launched with automated scripts or bots, when servers work overtime to cope with the traffic volume at the peak of a large-scale bot attack.
Unfortunately, bot attacks are just too common. According to The Atlantic, more than 94% of 100,000 domains surveyed in a report experienced at least one automated attack in the previous 90-day period.
Instead of solely relying on expensive content delivery network solutions, adding edge/device computing brings a new direction toward thwarting such attacks effectively at their origins.
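To illustrate what “thwarting attacks at their origins” could look like (a hypothetical heuristic, not a production defense): scripted bots often fire events at near-constant intervals, while human interaction timing is irregular, and a device or edge node can check this before any traffic reaches origin servers:

```python
import statistics

def looks_automated(event_timestamps: list[float],
                    min_events: int = 5,
                    max_jitter: float = 0.05) -> bool:
    """Flag a session whose inter-event gaps are suspiciously uniform."""
    if len(event_timestamps) < min_events:
        return False  # not enough evidence; let the traffic through
    gaps = [b - a for a, b in zip(event_timestamps, event_timestamps[1:])]
    # A standard deviation near zero means metronome-like timing --
    # characteristic of a script rather than a human.
    return statistics.pstdev(gaps) < max_jitter
```

A metronomic sequence such as events at 0, 1, 2, 3 and 4 seconds would be flagged, while irregular human-like timing would pass. In practice, a check like this would be one of many signals combined at the edge, sparing the origin servers from absorbing the bulk of bot traffic.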
Preparing Your Organization For The Edge Computing Future
The shift toward device computing makes even more sense when we pause to consider how virtually all domains of knowledge are being transformed through decentralization. The concerns that motivate a move from the cloud find parallels in those that gave rise to decentralized finance as a whole—and cryptocurrencies in particular. Similar reflections could be drawn from decentralized knowledge sources (e.g., Wikipedia), media production and distribution (e.g., YouTube and Spotify) and many more.
How can you start preparing for this future? Take the following first steps:
1. Start internal conversations across teams by taking stock of the cloud systems each functional area uses, whether they have any latency issues and how much they cost to run.
2. Think about possible alternatives, even if they feel futuristic—this is how all great innovation starts, after all. Involve IT and other technical areas in your organization and ask them to think big.
3. Ask around. Your peers in different industries or even friendly competitors may have already solved the issues you encountered and might be willing to share their experiences in the transition away from a cloud-only paradigm.
While the benefits of device computing are clear in many use cases, it is also true that cloud computing still has a lot to offer, including in the realm of advanced, data-driven machine learning approaches. In all likelihood, we will witness a new paradigm in which cloud and device computing are amalgamated, bringing the best of both worlds together.
Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?