January 1, 2020


Key big data architectural considerations for deploying in the cloud and on-premises (sponsored by NetApp)

When analytics applications become business critical, balancing cost against SLAs for performance, backup, dev/test, and disaster recovery is difficult. Karthikeyan Nagalingam discusses big data architectural challenges, how to address them, and how to create a cost-optimized solution for the rapid deployment of business-critical applications that meet corporate SLAs today and into the future.

Talk Title Key big data architectural considerations for deploying in the cloud and on-premises (sponsored by NetApp)
Speakers Karthikeyan Nagalingam (NetApp)
Conference Strata Data Conference
Conf Tag Make Data Work
Location New York, New York
Date September 26-28, 2017
URL Talk Page
Slides Talk Slides

Architects can design an analytics platform on data center or public cloud infrastructure, or leverage a managed public cloud service such as Azure HDInsight or Amazon EMR. Often, the location where data is generated determines the location of analytics services, even though it may not be the most cost-effective, flexible, or performant choice.

When analytics applications become business critical, SLAs for application performance, backup, and disaster recovery can be difficult to meet at very large (petabyte) scale. Additionally, some analytics services may only be available in certain clouds while the rest of the analytics happen in other clouds or in the data center; to deploy a comprehensive, cohesive solution, data needs to be able to flow between multiple clouds and your data center. As more applications are built on top of Hadoop data, creating dev and test environments and validating them against production data may also pose a problem. To make matters worse, data can be present in different formats, such as files or objects, and moving this data into HDFS requires a data transformation effort. Together, these challenges make big data deployments long and expensive.

Karthikeyan Nagalingam discusses these big data architectural challenges, how to address them, and how to create a cost-optimized solution for the rapid deployment of business-critical applications that meet corporate SLAs today and into the future. This session is sponsored by NetApp.
