
The round-up post for December 2016 covered McKinsey’s report on data science, Deepfield’s acquisition by Nokia, and AWS Managed Services. In this post, we’ll cover January 2017’s latest news and events from the data science, AI and cloud computing fields.

Intel Open-sources BigDL

Chip giant Intel has open-sourced BigDL, a deep learning framework that runs as a Spark job on top of Hadoop. It is designed to leverage pre-built Spark clusters to run deep learning algorithms and makes it easy to load large datasets. The framework supports Spark versions 1.5, 1.6 and 2.0, and provides an option to embed deep learning in existing Spark programs.

Although the framework’s modelling is based on Torch, tests show significant performance improvements over Torch when running on Xeon servers. It also performs better than other open-source frameworks like Caffe and TensorFlow. Note that Intel’s approach with BigDL differs from frameworks that leverage GPUs: it uses Intel’s Math Kernel Library (MKL) and the multi-threading extensions of Intel’s Xeon processors to execute the workload as a multi-threaded Spark job.

This results in faster execution of deep learning workloads.
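To make the distributed execution model concrete, here is a minimal pure-Python sketch (not BigDL’s actual API) of the synchronous data-parallel pattern such frameworks run on Spark: each partition computes a gradient on its shard of the data, and the driver averages those gradients before updating the shared model. The toy data, shard count, and learning rate are all illustrative assumptions.

```python
import random

# Toy data: y = 3*x plus noise, split across "partitions" (Spark-style shards).
random.seed(0)
data = [(x, 3.0 * x + random.gauss(0, 0.1)) for x in [i / 10 for i in range(40)]]
partitions = [data[i::4] for i in range(4)]  # 4 equal-size shards

def shard_gradient(w, shard):
    """Mean-squared-error gradient d/dw computed on one data shard."""
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

w = 0.0
for step in range(200):
    # Each partition computes a local gradient; the driver averages them
    # (the synchronous data-parallel pattern run on Spark executors).
    grads = [shard_gradient(w, shard) for shard in partitions]
    w -= 0.05 * sum(grads) / len(grads)

print(round(w, 2))  # close to the true slope 3.0
```

In BigDL itself, the shards are Spark RDD partitions and the per-shard work runs as multi-threaded MKL tasks on the executors; the averaging step is what keeps every copy of the model in sync.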

Intel aims to capture maximum market share in the big data space by going up against GPUs, and rolling out BigDL as an open-source library is definitely part of this goal.

IBM’s PowerAI Now Supports TensorFlow

IBM has added support for Google’s TensorFlow, a deep-learning framework that spans CPUs and GPUs, to PowerAI. This has been done to help position PowerAI as the ideal hardware blend for getting the most out of TensorFlow. PowerAI pairs IBM’s Power8 processor with the Nvidia Tesla Pascal P100 GPU in a next-generation machine learning server.

Since this server’s instruction set is custom-made, ML applications need to be written to execute on that specific instruction set. Note that most cloud-based machine learning platforms don’t support Pascal hardware.

IBM is planning to offer a complete combination of CPUs, GPUs, and software, all created to complement each other. It is also taking a more niche focus for PowerAI than some of its competitors: primarily, PowerAI will be deployed to train machine learning models from raw data.

This particular phase of any machine learning project is computation-intensive, especially for TensorFlow-powered deep learning applications. Hence, IBM wants to offer high-end CPUs and GPUs and design hardware that expedites the entire project. It’ll be interesting to see how the Power9 processors (to be launched this year) fulfil IBM’s PowerAI ambitions.

Google’s AI Codes AI Apps

The Google Brain AI research lab has created machine-learning software that can itself design AI software. The goal is to make future AI development cheaper and easier. In one experiment, Google’s AI team had the software design a machine-learning system to take a test used to benchmark language-processing software. The end result was superior to software written by humans.

According to Jeff Dean, who leads the Google Brain research group, the focus is on “automated machine learning” in order to accelerate the adoption of machine learning across industries. This way, part of the work done by machine learning experts could be taken over by AI software.

The researchers assigned their software to create learning systems for collections of different but related problems, such as navigating mazes. It crafted designs that showed an ability to generalize and pick up new tasks with less additional training than usual. The researchers also used 800 high-powered graphics processors to power software that came up with designs for image recognition systems rivalling the best designed by humans.
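Google hasn’t published the internals of this system, but the idea is in the spirit of neural architecture search: an outer “controller” proposes model designs, each candidate is trained and scored, and the best design is kept. Here is a toy pure-Python sketch under those assumptions, with polynomial degree standing in for a real architecture choice like layer count or connectivity:

```python
import random

# Toy task: recover y = x^2 from samples on [-1, 1].
train = [(i / 5, (i / 5) ** 2) for i in range(-5, 6)]

def fit_and_score(degree):
    """Train a polynomial model of the given "architecture" (degree)
    by gradient descent and return its mean squared error."""
    coefs = [0.0] * (degree + 1)
    for _ in range(500):
        grads = [0.0] * (degree + 1)
        for x, y in train:
            err = sum(c * x ** p for p, c in enumerate(coefs)) - y
            for p in range(degree + 1):
                grads[p] += 2 * err * x ** p / len(train)
        coefs = [c - 0.05 * g for c, g in zip(coefs, grads)]
    return sum((sum(c * x ** p for p, c in enumerate(coefs)) - y) ** 2
               for x, y in train) / len(train)

# The "controller": sample candidate architectures, keep the best-scoring one.
candidates = random.sample([0, 1, 2], 3)
best = min(candidates, key=fit_and_score)
print(best)  # the quadratic model wins: 2
```

Real architecture search replaces the random controller with a learned one (e.g. a recurrent network trained with reinforcement learning) and the polynomial fit with full neural-network training, which is why the Google experiments needed hundreds of GPUs.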

Don’t forget to share your views in the comments section if you found these advancements in AI exciting.
