
November 05, 2020

Article
4 min

BTEX 2020: How to Manage and Consolidate your AI Workloads

Learn about software and hardware to run your AI workloads in this Dell Technologies presentation from CDW's BTEX 2020 virtual event.


"AI is everywhere," says Thomas Henson, Business Development Manager, AI/Analytics at Dell Technologies, speaking at CDW's BTEX 2020 virtual event. "It's in the cars that we drive, in healthcare imaging and in any of our smart home devices. But where does this technology fit, and what can we do to usher in that wave?"

Henson's presentation highlighted the next generation of voice technology. We're starting to see this move from home automation to the enterprise, with call centres standing out as a business that's relying more on AI. One of the ways we're starting to see that work being automated is with this next generation of voice. It's so good now that it's hard to discern whether you're talking to a machine or to a human.

3 key points in the history of machine communications

1950: Alan Turing develops the Turing Test to answer the question "Can machines think?"

1966: ELIZA, an early chatbot from MIT, demonstrates basic communication between humans and machines.

2010s: The emergence of voice assistants like Siri and Alexa.

"People have been going at this since 1950, but in the last 10 years, we've really seen some good input," says Henson.

The key purpose of AI projects: how to extract value from data

"At Dell Technologies, we have a data-first approach," says Henson. "We want to capture the data in a way that can be accessed by our traditional assets and can improve our traditional assets."

If you have a data-first approach, you can fuel innovation and improve your assets across human capital, intellectual property and infrastructure. Understand what data elements you have, so you can make good decisions.

The infrastructure costs of artificial intelligence

Business intelligence is typically measured in megabytes, gigabytes and terabytes. From a performance perspective, CPUs generally range in the hundreds, but can be much higher for large enterprise data warehouses.

When it comes to analytics and machine learning — e.g., anomaly detection, recommendation engines, predicting outcomes — the data is typically semi-structured, so you would see it measured in terabytes to petabytes, with processing typically requiring hundreds and even thousands of CPUs.

When you get into deep learning — e.g., voice-to-text, image data, video data — this is fully unstructured data that is typically measured in petabytes to even exabytes. This is where it takes thousands of CPUs, or even GPUs, to analyze the data and train those models using deep learning.

"What we're trying to do here is solve a business problem," says Henson. "We want to map that back to the business, find the right data elements and then use the right tools to build a solution. The challenge is: how do we go from business problem to proof of concept (POC), and how do we scale that into production?"

Eliminate AI bottlenecks with Dell PowerScale

"Our PowerScale platform gives you the ability to build out a consolidated data lake, where you can build pipelines that are going to help you train your models faster," says Henson. "The faster you can get to the answer you're looking for, the faster you're going to solve that business problem."

This can improve your data team's productivity and maximize return on investment. Whether you're starting out from POC or scaling into production, your data can stay in place, and you can just bring compute for your deep-learning models into that environment.

Dell's AI starter bundle

Henson recommends a Dell solution he calls the AI starter bundle. This includes the Dell Precision Data Science Workstation, along with PowerScale. This gives you the compute, the networking and that data layer within PowerScale to be able to start off with 5 TB of data, but as that data starts to grow, you've got the solution that gives you that backbone and best practices for data consolidation.

If you bring in multiple Dell Precision Data Science Workstations, you can have multiple different modellers or data scientists building off that one PowerScale. Sharing the data between those, and not having to swap data or try to move things around, means that more data scientists can train those models. We're bringing that compute to the data.

As you move from POC into production, you know that you have the backbone and can integrate with other models of PowerScale and Isilon to build out your data lake, scale your modellers and scale your compute as well. You can go from POC to production seamlessly.

To learn more about Dell solutions, please visit CDW.ca/Dell. And be sure to bookmark this page for more coverage of BTEX 2020.