MinBFT, Hyperledger Lab - Open Source Project to Develop Efficient Consensus Protocol
February 29, 2020
Blockchain technologies are attracting a lot of attention from various communities. A consensus protocol is one of the key parts; it is a process for reaching agreement on a value among distributed nodes. I …
Kubernetes Native Infrastructure and CoreOS Operator Framework for 5G Edge Cloud Computing
February 27, 2020
Launched in 2018, Akraino Edge Stack has several blueprints in progress, such as the KNI (Kubernetes Native Infrastructure for Edge) project for 5G edge computing. Service providers started to explo …
Don't beat the market; beat the bots: Adversarial networks in finance
February 24, 2020
Automated investing has brought an immense amount of stability to the market, but it has also brought predictability. Garrett Lander and Al Kari examine whether an adversarial network can game the behavior of automated investors by learning the patterns in market activity to which they are most vulnerable.
Deploying machine learning models on the edge
February 20, 2020
When IoT meets AI, a new round of innovations begins. Yan Zhang and Mathew Salvaris examine the methodology, practice, and tools around deploying machine learning models on the edge. They offer a step-by-step guide to creating an ML model using Python, packaging it in a Docker container, and deploying it as a local service on an edge device, as well as deploying on GPU-enabled edge devices.
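The create-package-deploy loop described in this abstract can be sketched roughly as follows. This is a minimal illustration with assumed names (the toy model, `model.pkl`, and the Dockerfile lines are hypothetical), not the speakers' actual code:

```python
# Sketch of the edge-deployment workflow: create a model in Python,
# serialize it for packaging in a Docker image, load and serve it locally.
# The model, file names, and Dockerfile lines are illustrative assumptions.
import pickle

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, pure stdlib (toy 'ML model')."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return a, b

# 1. Create the model.
model = fit_line([0, 1, 2, 3], [1, 3, 5, 7])

# 2. Serialize it so it can be baked into a Docker image, e.g. with a
#    hypothetical Dockerfile like:
#        FROM python:3.9-slim
#        COPY model.pkl serve.py /app/
#        CMD ["python", "/app/serve.py"]
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# 3. On the edge device, the local service loads the artifact and predicts.
with open("model.pkl", "rb") as f:
    a, b = pickle.load(f)

def predict(x):
    return a * x + b

print(predict(5))  # -> 11.0
```

In a real deployment the pickled artifact would be a trained model from a framework such as scikit-learn or PyTorch, and the final step would run behind an HTTP endpoint inside the container rather than a bare function call.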
Executive Briefing: Unpacking AutoML
February 19, 2020
Paco Nathan outlines the history and landscape for vendors, open source projects, and research efforts related to AutoML. Starting from the perspective of an AI expert practitioner who speaks business fluently, Paco unpacks the ground truth of AutoML, translating from the hype into business concerns and practices in a vendor-neutral way.
Executive Briefing: Will you learn Chinese to advance in AI?
February 19, 2020
According to research by AI2, China is poised to overtake the US in the most-cited 1% of AI research papers by 2025. The view that China is a copycat but not an innovator may no longer be true. Charlotte Han explores what the implications of China's government funding, culture, and access to massive data pools mean to AI development and how the world could benefit from such advancement.
Unlocking data capital with AI (sponsored by Dell)
February 17, 2020
As we look toward more demanding applications of artificial intelligence to unlock value from data, it's increasingly essential to develop a sustainable big data strategy and to efficiently scale artificial intelligence initiatives. Arash Ghazanfari covers the fundamental principles that need to be considered in order to achieve this goal.
Assumed risk versus actual risk: The new world of behavior-based risk modeling
February 16, 2020
Viridiana Lourdes explains how banks and financial enterprises can adopt and integrate actual risk models with existing systems to enhance the performance and operational efficiency of the financial crimes organization. Join in to learn how actual risk models can reduce segmentation noise, utilize unlabeled transactional data, and spot unusual behavior more effectively.
Building an AI platform: Key principles and lessons learned
February 16, 2020
Moty Fania details Intel's IT experience of implementing a sales AI platform. This platform is based on a streaming, microservices architecture with a message bus backbone. It was designed for real-time data extraction and reasoning, processes millions of website pages, and is capable of sifting through millions of tweets per day.
Cisco Data Intelligence Platform (sponsored by Cisco)
February 16, 2020
Siva Sivakumar explains the Cisco Data Intelligence Platform (CDIP), a cloud-scale architecture that brings together big data, an AI/compute farm, and storage tiers to work as a single entity while also being able to scale independently to address the IT issues in the modern data center.
Deep learning on Apache Spark at CERN's Large Hadron Collider with Analytics Zoo
February 15, 2020
Sajan Govindan outlines CERN's research on deep learning in high energy physics experiments as an alternative to customized rule-based methods, with an example of topology classification to improve real-time event selection at the Large Hadron Collider. CERN runs deep learning pipelines on Apache Spark using the BigDL and Analytics Zoo open source software on Intel Xeon-based clusters.
Executive Briefing: What it takes to use machine learning in fast data pipelines
February 14, 2020
Dean Wampler dives into how (and why) to integrate ML into production streaming data pipelines and to serve results quickly; how to bridge data science and production environments with different tools, techniques, and requirements; how to build reliable and scalable long-running services; and how to update ML models without downtime.
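One of the topics this briefing names, updating ML models without downtime, is often handled by atomically swapping the model reference that the scoring path reads. A minimal sketch of that pattern, with assumed names (`ModelServer`, `score`, `update` are illustrative, not from the talk):

```python
# Hedged sketch of zero-downtime model updates in a long-running service:
# the scoring path reads whichever model is currently installed, and a
# deploy simply swaps the reference. Names here are illustrative assumptions.
import threading

class ModelServer:
    """Serves predictions from whatever model is currently installed."""

    def __init__(self, model):
        self._lock = threading.Lock()
        self._model = model

    def update(self, new_model):
        # Swap under a lock: in-flight requests finish with the old model,
        # later requests see the new one -- no pipeline restart needed.
        with self._lock:
            self._model = new_model

    def score(self, record):
        with self._lock:
            model = self._model
        return model(record)

server = ModelServer(lambda x: x * 2)   # v1 model
v1 = server.score(21)                   # -> 42
server.update(lambda x: x * 2 + 1)      # roll out v2 while serving
v2 = server.score(21)                   # -> 43
```

In a production streaming pipeline the same idea usually appears as a model registry or a side channel on the message bus that tells each worker to load a new artifact; the lock-and-swap above is the in-process core of that technique.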
From whiteboard to production: A demand forecasting system for an online grocery shop
February 13, 2020
Data-driven software is revolutionizing the world and enabling the intelligent services we interact with daily. Robert Pesch and Robin Senge outline the development process, statistical modeling, data-driven decision making, and components needed for productionizing a fully automated and highly scalable demand forecasting system for an online grocery shop of a billion-dollar retail group in Europe.
Staying safe in the AI era
February 10, 2020
Machine learning and artificial intelligence are no longer science fiction, so now you have to address what it takes to harness their potential effectively, responsibly, and reliably. Based on lessons learned at Google, Cassie Kozyrkov offers actionable advice to help you find opportunities to take advantage of machine learning, navigate the AI era, and stay safe as you innovate.
Unleash the power of data at scale (sponsored by Intel)
February 8, 2020
Data analytics is the long-standing but constantly evolving science that companies leverage for insight, innovation, and competitive advantage. Jeremy Rader explores Intel's end-to-end data pipeline software strategy, designed and optimized for a modern and flexible data-centric infrastructure that allows for the easy deployment of unified advanced analytics and AI solutions at scale.