The AI and data science terminology you need to know

By Stuart Davie and Tom Hassall on July 15, 2019

Artificial intelligence. You’re probably hearing these two words more and more in a business setting, not to mention in your everyday life.

From asking Siri to skip that annoying song that somehow keeps coming up on shuffle, to powering data-driven decisions for your company, AI is becoming more and more commonplace – and this trend is only going to go one way.

As a starter for ten, here’s some key data science terminology to be aware of if you’re looking to learn more about AI and its benefits!

Artificial intelligence
This seems like a good place to start! At Peak, we think of AI as a general term that refers to computer hardware or software that behaves in a way which appears intelligent. The term was first coined by Dartmouth Assistant Professor John McCarthy way back in 1956 when he said: “AI is the science and engineering of making intelligent machines, especially intelligent computer programs.”

Computational neuroscience
This term was first used at a conference by Eric L. Schwartz in 1985, who was providing a review of a field that had no previous name. Computational neuroscience explains how electrical and chemical signals are used in the brain to represent and process information.

Cybernetics
Cybernetics is the study of how information is communicated in machines and electronic devices – which is then compared to how information is communicated in the brain and nervous system.

Data science
Data science is at the heart of what we do here at Peak. It combines domain expertise with programming skills, as well as mathematical and statistical know-how, in order to extract valuable insights from data. We apply machine learning algorithms to a whole host of data types to allow our AI System to execute tasks that would ordinarily require human intelligence, with a focus on delivering profitable outcomes for businesses.

Neural networks
Neuroscientists have discovered that the brain is composed of billions of interconnected processing units called neurons. A neuron receives inputs from multiple sources, integrates that information, and sends an output to many other connected neurons: a neural network. Due to the highly interconnected nature of a neural network, it’s ideal for learning and recognising relationships between many related pieces of information. Because neurons are analogue processing units, they don’t rely on strict logical rules like a normal computer program, and can therefore adapt to new situations when they arise. Neural networks have found use in many fields such as computer vision and speech recognition, but they can also be used to learn and predict relationships between many other types of real-world information streams, like market trends and patterns of consumer behaviour. Peak takes advantage of the power of neural networks to process and understand vast quantities of data, helping businesses make better decisions.
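To make the idea concrete, here’s a minimal sketch of a single artificial neuron in Python. The weights, bias, and input values are illustrative only (real networks learn these from data), and we assume a sigmoid activation function, one common choice among many:

```python
import math

def neuron(inputs, weights, bias):
    # Integrate the weighted inputs, then squash the total through a
    # sigmoid activation so the output lands between 0 and 1.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# Two input signals with hand-picked (not trained) weights.
output = neuron([0.5, 0.8], [0.4, -0.2], 0.1)
print(round(output, 3))  # prints 0.535
```

A full neural network simply wires many of these units together, with each neuron’s output feeding the inputs of others.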


Deep learning
Neural networks benefit from having multiple layers. For example, if an image recognition neural network is shown a picture of a white fluffy cat, the first layer might recognise nothing more than the presence of fluffiness in the image. The next layer might recognise the eyes and ears, and as deeper layers are added, the model can begin to piece together the general patterns of fluffiness, eye position, ears and general cuteness required to recognise that this is a cat and not a dog. Deep neural networks therefore enable deeper insights into a given dataset and can sometimes outperform humans in decision making and planning.
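Here’s a toy forward pass through a “deep” network to show how layers stack: each layer transforms the previous layer’s output, so later layers can combine simpler features into more complex ones. The weights below are illustrative placeholders, not trained values:

```python
import math

def layer(inputs, weight_matrix, biases):
    # One output per row of weights: weighted sum plus bias, then sigmoid.
    return [
        1 / (1 + math.exp(-(sum(x * w for x, w in zip(inputs, row)) + b)))
        for row, b in zip(weight_matrix, biases)
    ]

def deep_forward(inputs, layers):
    # Feed the output of each layer into the next one.
    for weights, biases in layers:
        inputs = layer(inputs, weights, biases)
    return inputs

# Two layers feeding a single, made-up "cat-ness" score.
network = [
    ([[0.5, -0.3], [0.8, 0.2]], [0.0, 0.1]),  # layer 1: 2 inputs -> 2 outputs
    ([[1.0, -1.0]], [0.0]),                   # layer 2: 2 inputs -> 1 output
]
score = deep_forward([0.9, 0.4], network)[0]
```

Training (adjusting those weights from examples) is where the real work happens, but the layered structure itself is this simple.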

Fuzzy logic
Fuzzy logic was a term coined by the University of California’s Dr. Lotfi Zadeh back in the 1960s. It’s defined as an approach to computing that’s based on the idea of “degrees of truth”, as opposed to the usual black-and-white, “true or false” logic that modern computers are based on.
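A quick sketch of what “degrees of truth” means in practice: instead of a temperature being strictly “hot” or “not hot”, it belongs to the set “hot” to some degree between 0 and 1. The thresholds below are arbitrary examples:

```python
def hot_membership(temp_c, cold=15.0, hot=30.0):
    # Below `cold` -> 0.0, above `hot` -> 1.0, linear ramp in between.
    if temp_c <= cold:
        return 0.0
    if temp_c >= hot:
        return 1.0
    return (temp_c - cold) / (hot - cold)

# Classic fuzzy operators: AND takes the minimum degree of truth,
# OR takes the maximum, NOT is 1 minus the degree.
def fuzzy_and(a, b):
    return min(a, b)

print(hot_membership(22.5))  # prints 0.5 -- "somewhat hot"
```

Compare that with ordinary Boolean logic, where `22.5 > 30` would simply return `False` and the nuance is lost.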

Machine learning
Also an increasingly used term in the world of business, machine learning (often abbreviated to ML) is an application of AI. It enables systems to learn automatically, improve from experience, and get smarter over time, without being explicitly programmed for each task.
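To illustrate “learning from experience rather than explicit programming”, here’s a minimal example using ordinary least squares, one of the simplest learning methods. The rule relating x and y is never written into the code; the model recovers it from example data:

```python
def fit_line(xs, ys):
    # Closed-form least squares for a single feature: find the slope and
    # intercept that best fit the observed (x, y) pairs.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# The "experience": past observations generated by y = 2x + 1,
# though that rule appears nowhere in the program.
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
slope, intercept = fit_line(xs, ys)
print(slope, intercept)  # prints 2.0 1.0
```

Real ML systems use far richer models (like the neural networks above), but the principle is the same: the behaviour comes from the data, not from hand-written rules.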

Natural language processing
Often abbreviated to NLP, natural language processing is a branch of AI that helps computers to understand and interpret human language – both text and speech. So, when you ask Siri to skip that aforementioned earworm, that interaction is made possible because of NLP – the AI recognises your voice, understands the action that you’re requesting, executes the action, and responds appropriately within seconds.
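As a toy sketch of one small step in that pipeline, here’s keyword-based intent recognition: mapping a transcribed voice command to a structured action. Real assistants use trained language models rather than hand-written rules, but the overall shape (text in, structured action out) is similar:

```python
def parse_command(text):
    # A deliberately simple rule-based intent classifier.
    words = text.lower().split()
    if "skip" in words or "next" in words:
        return "skip_track"
    if "pause" in words or "stop" in words:
        return "pause_playback"
    return "unknown"

intent = parse_command("Hey, skip this song")
print(intent)  # prints skip_track
```

The hard parts of real NLP (ambiguity, context, accents in speech) are exactly where statistical models earn their keep over rules like these.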

If you’re interested in how NLP can be beneficial in a business setting, be sure to check out our case study with multinational pharmaceutical firm GSK.

Skynet
Uh oh. How did this one sneak in here? The evil antagonists from the Terminator franchise probably don’t give off the best first impression when it comes to AI, but rest assured that Peak’s approach and our solutions are a lot less terrifying – honest.

If you’re interested to learn how you can apply some of the above data science terminology in your business to drive growth and deliver value, we’d love to hear from you.
