It's exceptionally rare that technology advances happen in isolation. A good idea is rarely had by only one person or one company. We are seeing one such advance right now: machine learning for IT.
Machine learning is the next natural evolution of IT analytics. It takes the insight already being derived from IT data sets like log files, code data, and digital interactions, and makes it both interactive and predictive.
As with IT analytics, not all machine learning is created equal. There's a lot of hype out there, not to mention a bit of hysteria. Let me set the record straight. Just because a marketer calls it "Artificial Intelligence" doesn't mean it is. Machine learning isn't going to replace you. It's going to make you better and smarter at your job, and your organization will be more successful for it.
AI is still hype. IA is reality.
Alexa. Siri. Google. Smart digital assistants are rapidly integrating into everything from the way we drive our cars to the way we shop to how we listen to music. The familiar term for this is "Artificial Intelligence" or "AI," and with its use come references to Skynet and I, Robot. But today's "AI" technologies are a far cry from their depictions in Hollywood.
In reality, machine learning today is a form of intelligence augmentation, or "IA." Alexa isn't able to look in the refrigerator, see that the milk is nearly empty, and then go purchase more. But she can and does make it easier for humans to remember to buy milk by adding it to a shopping list on command.
IA Supports Human Innovation
Humans have always used tools and technology to augment their work. It is what sets us apart from almost every other species on this planet. Few others use tools, and those that do have never come close to matching our relentless pursuit of innovation. With technology, we extend ourselves further and faster than we ever could alone. But even as technology becomes increasingly sophisticated – able to take on more human tasks with greater independence – it continues to rely on our extraordinary minds to solve tomorrow's challenges. If we simply stopped inventing -- stopped pursuing -- technology too would be at a standstill.
IA Advances Careers
It is highly likely that machine learning will reduce the need for manual labor, including in IT. But this does not make it a job killer. On the contrary, as machines take over basic tasks, they pave the way for safer jobs that better leverage humans' capacity for creativity and critical thinking to drive continuous improvement.
This transition is imperfect. The accelerating pace of innovation has outstripped the ways we prepare people for the workforce. It's incumbent upon us to catch up in order to preserve our social and economic security.
In IT, Not All IA Is Created Equal
In a world where every vendor uses the same terms to describe their solution, it can be difficult to parse the differences between products. When it comes to evaluating machine learning, there are two key considerations.
- The Learning. Many machine learning technologies are math-based, relying solely on algorithms to process and interpret data. Truly "intelligent" machine learning also leverages visual processing for multi-dimensional data analysis and heuristics to deliver insight at the speed of modern business. The best machine learning also incorporates human intelligence.
- The Data Source. Machine learning technologies are only as intelligent as their data source. If the data is limited in scope or subject to manipulation, then no matter how strong the algorithms or heuristics are, their outputs are also limited and suspect. High-fidelity, real-time data isn't required for machine learning, but it transforms the quality and reliability of the insight.
ExtraHop Addy is the first SaaS offering that observes and analyzes all digital interactions and applies machine learning to detect anomalies in real-time. Using data from the ExtraHop platform, Addy builds continuous baselines for every device, network, and application, and then proactively detects and surfaces potential issues in the environment. The core algorithm and heuristics also incorporate feedback from in-house and crowd-sourced domain expertise to reduce the number of false positives and keep IT teams focused on the most critical issues.
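To make the idea of continuous baselining concrete: the following is a minimal, hypothetical sketch of per-entity anomaly detection, not ExtraHop's actual algorithm. It maintains a running mean and variance for each entity (device, network, or application) using Welford's online method, flags values that deviate sharply from that entity's own baseline, and exposes a simple feedback hook that adjusts sensitivity to reduce false positives. The class and parameter names are illustrative assumptions.

```python
from collections import defaultdict
import math

class Baseline:
    """Running mean/variance for one entity (Welford's online algorithm)."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def stddev(self):
        return math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0

class AnomalyDetector:
    """Flags metrics that deviate sharply from each entity's own baseline."""
    def __init__(self, threshold=3.0, min_history=30):
        self.baselines = defaultdict(Baseline)
        self.threshold = threshold      # z-score cutoff
        self.min_history = min_history  # samples needed before judging

    def observe(self, entity, value):
        """Return True if `value` is anomalous for `entity`, then learn it."""
        b = self.baselines[entity]
        anomaly = False
        if b.n >= self.min_history:
            sd = b.stddev()
            if sd > 0 and abs(value - b.mean) / sd > self.threshold:
                anomaly = True
        b.update(value)  # the baseline keeps learning either way
        return anomaly

    def feedback(self, was_useful):
        """Expert feedback nudges sensitivity to cut false positives."""
        self.threshold *= 0.95 if was_useful else 1.05
```

For example, after observing a web server's response times hovering around 50 ms, a sudden 500 ms reading would be flagged, while another reading near the baseline would not. A production system would of course use far richer features and models, but the shape of the loop — learn a baseline per entity, score deviations, fold in human feedback — is the same.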
So what sets Addy apart?
- Our data is unbiased and comprehensive. Our data isn't self-reported by systems, as log files are, and it doesn't rely on code instrumentation. Our data set is digital communications -- all digital interactions that occur between all systems within the infrastructure. We don't just look at machines or applications. We see -- and analyze -- every single element that makes up the digital experience or supports the digital business. That means better performance, higher availability, and greater security.
- Our machine learning is more than math. ExtraHop Addy relies on a combination of algorithms, heuristics, and visual processing, seamlessly incorporating both statistical and visual learning. It also layers in feedback from human domain experts for continuous improvement.
- Our insight is real time. ExtraHop doesn't perform post-hoc analysis on data written to disk. All of our analytics, including Addy's machine learning, are performed on the data in flight. This means that there's zero latency between what's happening in your environment and when you know about it.
And this is just the beginning. With our rich dataset and proprietary machine learning, there are applications in security, industry benchmarking, and automation, to name just a few.
Today, the world centers on digital experience, and it must be seamless, secure, and ubiquitous. In enterprise IT, which is tasked with supporting a seamless digital experience for everything from online shopping to patient care to shipping logistics, the promise of IA is even more powerful.
To learn more about the power of Addy, read the press release.
This is a companion discussion topic for the original entry at https://www.extrahop.com/company/blog/2017/ai-comes-to-it/