Is 2015 the Year of Deep Learning? by Akram Hussain | Syncfusion Blogs


This blog originally appeared on Packt Publishing’s Big Data and Business Intelligence blog.

The new phenomenon to hit the world of big data seems to be deep learning. I’ve read many articles and papers where people question whether there’s a future for it, or if it’s just a buzzword that will die out like many terms before it. Likewise, I have seen people who are genuinely excited and truly believe deep learning is the future of artificial intelligence: the one solution that can greatly improve the accuracy of the predictions we draw from data and the systems we develop.

Deep learning is currently a very active research area. It is by no means an established industry standard, but rather an approach that is picking up pace and holds strong promise as a game changer for dealing with raw, unstructured data.

So what is deep learning?

Deep learning is a concept that grew out of machine learning. In very simple terms, we can think of machine learning as a method of teaching machines (using complex algorithms that form neural networks) to make better predictions of outcomes based on patterns and behavior in an initial data set.
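To make that idea concrete, here is a minimal, illustrative sketch (plain Python, not from the original article): a single artificial neuron, the basic building block of a neural network, learns to predict the logical AND pattern from a handful of examples by repeatedly nudging its weights toward the correct answers.

```python
# A single artificial neuron (perceptron) learning the logical AND
# pattern from example data. Purely illustrative.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]
bias = 0.0
rate = 0.1  # how strongly each mistake adjusts the weights

for _ in range(20):  # repeat over the data until predictions settle
    for (x1, x2), target in data:
        prediction = 1 if x1 * weights[0] + x2 * weights[1] + bias > 0 else 0
        error = target - prediction          # 0 when the neuron is right
        weights[0] += rate * error * x1      # nudge weights toward the
        weights[1] += rate * error * x2      # correct outcome
        bias += rate * error

for (x1, x2), target in data:
    prediction = 1 if x1 * weights[0] + x2 * weights[1] + bias > 0 else 0
    print((x1, x2), "->", prediction)
```

After training, the neuron predicts the correct output for all four input pairs; the "learning" is nothing more than that repeated weight adjustment.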

Deep learning goes a step further. The idea is based on a set of techniques for training machines (neural networks) to process information with levels of accuracy approaching those of the human eye.

Deep learning currently provides some of the best solutions to problems in image recognition, speech recognition, object recognition, and natural language processing.

There are a growing number of deep learning libraries available for a wide range of languages (Python, R, Java) and frameworks, such as Caffe, Theano, darch, H2O, Deeplearning4j, DeepDist, and many others.

How does deep learning work?

The central idea of deep learning is the deep neural network. Deep neural networks take artificial neural networks and stack them on top of one another to form layers arranged in a hierarchy. Each layer in the hierarchy learns more about the qualities of the initial data: the output from level one becomes the input to level two. The same filtering process is repeated a number of times until the level of accuracy allows the machine to identify its target as accurately as possible. It’s essentially a repeated process that keeps refining the initial data set.
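As a rough sketch of that layered flow (the weights below are arbitrary and chosen purely for illustration; in a real network they would be learned from data), the output of one layer literally becomes the input to the next:

```python
import math

def layer(inputs, weights, biases):
    """One network layer: weighted sums of the inputs, each squashed
    through a sigmoid so the outputs land between 0 and 1."""
    return [
        1 / (1 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
        for row, b in zip(weights, biases)
    ]

# Arbitrary illustrative weights (not learned).
w1 = [[0.5, -0.6], [0.3, 0.8]]   # layer 1: 2 inputs -> 2 hidden units
b1 = [0.0, -0.1]
w2 = [[1.2, -0.7]]               # layer 2: 2 hidden units -> 1 output
b2 = [0.2]

x = [0.9, 0.4]          # the initial data
h = layer(x, w1, b1)    # output of level one...
y = layer(h, w2, b2)    # ...is the input to level two
print(h, y)
```

Stacking more such layers is what makes the network "deep": each level works on the representation produced by the level below it.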

Here is a simple example of deep learning. Imagine a face. We as humans are very good at making sense of what our eyes show us, all the while doing it without even realizing it. We can easily make out a person’s face shape, eyes, ears, nose, mouth, and other features.

We take this for granted and don’t fully appreciate how difficult and complex it can be until we write programs for machines to do what comes naturally to us. The difficulty for machines in this case is pattern recognition—identifying edges, shapes, and objects.

The aim of deep learning is to develop these deep neural networks by adding and refining layers, training each network to learn more about the data until its accuracy matches a human’s.

What is the future of deep learning?

Deep learning certainly has a bright future, though it is not a new concept; I would actually argue it has now become practical rather than theoretical.

We can expect to see the development of new tools, libraries, and platforms, and even improvements to current technologies such as Hadoop, to accommodate the growth of deep learning.

However, it may not be all smooth sailing. Deep learning is still a very difficult and time-consuming field to master, especially when it comes to optimizing networks as data sets grow larger and larger, and those networks will surely be prone to errors. Additionally, the hierarchies of networks formed will have to scale to larger, more complex, and data-intensive AI problems.

Nonetheless, the popularity of deep learning has seen large organizations invest heavily: Google acquired DeepMind for $400 million, and Twitter purchased Madbits, to name just two high-profile investments among many. 2015 really does seem like the year deep learning will show its true potential.
