Anomaly Detection based on Prediction - A Step Closer to General Artificial Intelligence
Anomaly detection refers to the problem of finding patterns that do not conform to expected behavior [1]. In the last article, "Understanding Neocortex to Create Intelligence", we explored how applications based on the workings of the neocortex create intelligence. Pattern recognition combined with prediction makes the human brain the ultimate intelligent machine. Prediction helps humans detect anomalies in their environment: before every action is taken, the neocortex predicts the outcome, and if the actual outcome deviates from the expected one, the neocortex detects an anomaly and takes the steps needed to handle it. A system that claims to be intelligent should have anomaly detection in place. Recent findings from research on the neocortex have made it possible to create applications that do anomaly detection. Numenta's NuPIC, using the Hierarchical Temporal Memory (HTM) framework, is able to do inference...
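As a rough illustration (and not Numenta's HTM algorithm), here is a minimal sketch of prediction-based anomaly detection: a simple model predicts the next value of a stream, and any point whose deviation from that prediction exceeds a threshold is flagged as an anomaly. The moving-average predictor, window size and threshold are illustrative assumptions.

```python
# Minimal sketch of prediction-based anomaly detection.
# A moving-average predictor stands in for a real model such as HTM;
# the window size and threshold are illustrative assumptions.
from collections import deque

def detect_anomalies(stream, window=5, threshold=5.0):
    """Flag values whose deviation from the predicted value exceeds `threshold`."""
    history = deque(maxlen=window)
    anomalies = []
    for t, value in enumerate(stream):
        if len(history) == window:
            prediction = sum(history) / window   # the "expected outcome"
            error = abs(value - prediction)      # deviation from expectation
            if error > threshold:
                anomalies.append((t, value))
                continue                         # keep spikes out of the history
        history.append(value)
    return anomalies

if __name__ == "__main__":
    data = [10, 11, 10, 12, 11, 10, 11, 50, 11, 10, 12, 11]
    print(detect_anomalies(data))   # -> [(7, 50)]
```

The same idea scales up: swap the moving average for a stronger sequence model, and the anomaly score becomes the gap between what the model predicted and what actually happened.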
From Jeeves to Jarvis (and Bertie to Mark)
Mark Zuckerberg wants to build a robot that can serve as a personal home assistant, like a valet or butler. This is his resolution (he calls it a 'personal challenge') for 2016. https://www.facebook.com/zuck/posts/10102577175875681 He likes to compare his Artificially Intelligent (AI) butler with Jarvis, the fictional character from Marvel's Iron Man films. In the movies, Jarvis (Just a Rather Very Intelligent System!) is built by Iron Man himself, using AI technologies. Butlers have come a long way since Jeeves, the fictional faithful manservant created by the legendary author P. G. Wodehouse. Jeeves not only takes care of his master Bertie Wooster, but also provides ingenious solutions to all of Bertie's problems, from selecting the right socks to rescuing him from breach-of-promise suits. Zuckerberg's AI butler would help him run his house and assist with his work too. It would understand Zuckerberg's voice and control the music, lights, temperature and so on. The but...
GPU - The brain of Artificial Intelligence
Machine Learning algorithms can require tens of thousands of CPU-based servers to train a model, which turns out to be an expensive activity. Machine Learning researchers and engineers are often faced with the problem of running their algorithms fast. Although initially invented for processing graphics in computer games, GPUs today are used in machine learning to perform feature detection from vast amounts of unlabeled data. Compared to CPUs, GPUs take far less time to train models that perform classification and prediction. Characteristics of GPUs that make them ideal for machine learning:
- Handle large datasets
- Need far less data centre infrastructure
- Can be specialized for specific machine learning needs
- Perform vector computations far faster than general-purpose CPUs
- Are designed for data-parallel computation (see the sketch after this excerpt)
NVIDIA CUDA GPUs today are used to build deep learning image processing tools for Adobe Creative Cloud. According to NVID...
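As a rough illustration of the data-parallel, vector-oriented workloads described above, the sketch below times the same matrix multiplication with NumPy on the CPU and, if the optional CuPy library and an NVIDIA CUDA GPU happen to be available, on the GPU. The matrix size and the crude timing are illustrative assumptions, not a benchmark.

```python
# Rough sketch: the same data-parallel vector computation on the CPU (NumPy)
# and, when a CUDA GPU plus the CuPy package are available, on the GPU.
# Sizes and timings are illustrative, not a rigorous benchmark.
import time
import numpy as np

def time_matmul(xp, n=2000):
    """Multiply two n x n matrices using the given array module (numpy or cupy)."""
    a = xp.random.rand(n, n).astype(xp.float32)
    b = xp.random.rand(n, n).astype(xp.float32)
    start = time.time()
    c = xp.matmul(a, b)                        # one data-parallel op over n*n elements
    if xp.__name__ == "cupy":
        xp.cuda.Stream.null.synchronize()      # wait for the asynchronous GPU kernel
    return time.time() - start

print(f"NumPy (CPU): {time_matmul(np):.3f} s")

try:
    import cupy as cp                          # requires an NVIDIA CUDA GPU
    print(f"CuPy (GPU):  {time_matmul(cp):.3f} s")
except ImportError:
    print("CuPy not installed; skipping the GPU run.")
```

The point of the sketch is the shape of the workload: a single operation applied independently across millions of elements, which is exactly what a GPU's many cores are built to execute in parallel.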