AI and Machine Learning Glossary

Algorithm: An algorithm is a process or set of steps to follow to solve a problem. In AI and machine learning, there are a number of different algorithms for solving complex problems; your choice depends on the question at hand (are you making predictions? finding patterns? looking for connections among bits of data?) and the type of data you have available.

Artificial intelligence (AI): Artificial Intelligence, or AI, refers to a computer's ability to process information, find patterns, make decisions, and even predict future outcomes – in essence, to function similarly to a human brain. Marketing teams can use AI to process enormous amounts of customer data to deliver customized user experiences and to predict customer needs and behaviors.

Collaborative filtering: BlueConic uses collaborative filtering to drive its content and product recommendation systems. Collaborative filtering sifts through vast amounts of data to find patterns among similar users, in order to predict the best matches or items for users who exhibit similar habits, behaviors, or choices.
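To make the idea concrete, here is a minimal, generic sketch of user-based collaborative filtering in Python: it scores unseen items for a user by weighting other users' behavior by how similar those users are. It is purely illustrative and does not represent BlueConic's actual recommendation implementation.

```python
# Toy user-based collaborative filtering: score unseen items for a target user
# by weighting other users' interactions by how similar those users are.
# Purely illustrative -- not BlueConic's actual recommendation implementation.
import numpy as np

# Rows = users, columns = items; 1 means the user viewed or bought the item.
interactions = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 0, 0],
    [1, 1, 0, 1, 1],
], dtype=float)

def cosine_similarity(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def recommend(user_idx, matrix, top_n=2):
    target = matrix[user_idx]
    # Similarity between the target user and every other user.
    sims = np.array([cosine_similarity(target, other) for other in matrix])
    sims[user_idx] = 0.0  # ignore self-similarity
    # Predicted score per item = similarity-weighted sum of other users' interactions.
    scores = sims @ matrix
    scores[target > 0] = -np.inf  # don't recommend items the user already has
    return np.argsort(scores)[::-1][:top_n]

print(recommend(2, interactions))  # item indices predicted to match user 2's tastes
```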

Data processing for AI: Before data can be used to train a model, it has to be processed. For example, numbers are normalized to improve the performance of the machine learning algorithm. One advantage of doing this with BlueConic's AI Workbench is that data scientists don't have to spend nearly as much time processing data as they normally would, because much of it is already transformed. BlueConic gives you a single view of the customer, so you don't need to stitch together data from different files to create that view.

Since you can also use the full power of the Python programming ecosystem with AI Workbench, complex transformations are easy to apply to BlueConic profile data. There are still cases where you will need to spend time transforming data yourself, such as converting a date to the number of days since that date.
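As a concrete illustration, here is a minimal pandas sketch of the two transformations mentioned above: normalizing a numeric value and converting a date to the number of days since that date. The DataFrame and column names are hypothetical stand-ins for profile data.

```python
# A minimal sketch of the two transformations mentioned above, using pandas.
# The DataFrame and column names are hypothetical stand-ins for profile data.
import pandas as pd

profiles = pd.DataFrame({
    "visits": [3, 12, 7, 25],
    "last_order_date": ["2023-11-02", "2024-01-15", "2023-08-30", "2024-02-01"],
})

# Normalize a numeric property to the 0-1 range so features share a common scale.
visits = profiles["visits"]
profiles["visits_normalized"] = (visits - visits.min()) / (visits.max() - visits.min())

# Convert a date to "days since that date" so the model can use it as a number.
last_order = pd.to_datetime(profiles["last_order_date"])
profiles["days_since_last_order"] = (pd.Timestamp.today() - last_order).dt.days

print(profiles)
```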

Jupyter notebook: In BlueConic, the AI Workbench uses the Jupyter notebook environment for creating and running machine learning models against BlueConic data. Learn more: Jupyter notebooks

Machine learning: Machine learning is a branch of AI that uses algorithms and models trained on thousands or millions of pieces of data to help businesses make better decisions or predictions. BlueConic uses machine learning algorithms to determine optimal product and content recommendations.

Model: A model is a data structure that represents a real-world process (for example, the relationship between visits, page views, and propensity to buy). You're basically codifying your hypothesis of how (a small part of) the world works. Let's say, for example, that your hypothesis is 'the number of page views and visits determines how likely a customer is to buy something.' In that case, the relationship between page views, visits, and whether the customer bought something is the model.
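To make this concrete, here is a minimal sketch of that hypothesis expressed as a logistic regression model in Python: page views and visits go in, and the model learns their relationship to whether the customer bought something. The numbers are made up purely for illustration.

```python
# A minimal sketch of the hypothesis above as a logistic regression model:
# page views and visits as inputs, "did the customer buy?" as the outcome.
# The data is made up purely for illustration.
from sklearn.linear_model import LogisticRegression

X = [  # [page_views, visits] per customer
    [3, 1], [40, 8], [5, 2], [60, 12], [2, 1], [35, 6],
]
y = [0, 1, 0, 1, 0, 1]  # 1 = bought something, 0 = did not

model = LogisticRegression().fit(X, y)

# The trained model now encodes the relationship between the inputs and buying,
# and can estimate the probability of a purchase for a new customer.
print(model.predict_proba([[20, 4]])[0][1])
```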

Models are at the heart of an AI and machine learning workflow:

  • Building AI models: In AI Workbench, data scientists build a custom model or import one from a Python library. Using the embedded Jupyter notebook environment, teams can build their own models, or tweak any existing models within the BlueConic library. If you're not familiar with Python, you can also use the built-in example models BlueConic provides for propensity scoring and calculating customer lifetime value.
  • Training AI models: Typically, you train a machine learning model by giving it an initial set of representative data that teaches it to recognize patterns, calculate scores, etc. Next you test and validate the model, and then apply it to larger datasets to check your results (see the sketch after this list for a minimal example).
  • Optimizing AI models: A key step in the workflow is to optimize your model once you've seen how it runs with real customer data. Using Jupyter within BlueConic gives you access to unified customer data that updates as the customer's attributes change. Your models always have access to the most up-to-date data because they pull directly from BlueConic's persistent, dynamic profiles. Use the insights from your model to create smarter segments based on the model's output, such as prediction scores.
  • Deploying AI models: Deploying a model means sending the outcome of the model to your external marketing systems. Because models run in BlueConic, which is already connected to all your activation channels, you can deploy the model across all your marketing systems. With AI Workbench, you can attach scores as profile properties to activate segments in real time. It also enables you to do lookalike modeling based on changing behaviors, for example.
  • Scheduling AI models: Scheduling a model means setting when and how often it runs. Typically, it's difficult for data scientists to keep predictions up to date because their models operate on static data. With AI Workbench, marketing teams can schedule a Jupyter notebook to run automatically, which ensures their segments always use the latest predictions.
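The sketch below illustrates the train, validate, and score portion of this workflow with scikit-learn on hypothetical profile data. Writing the resulting scores back to BlueConic profile properties is represented here by a plain DataFrame column; it is not the actual BlueConic API.

```python
# A minimal sketch of the train / validate / score steps of the workflow,
# using scikit-learn on hypothetical profile data. Attaching scores back to
# BlueConic profile properties is represented by a plain DataFrame column;
# this does not show the actual BlueConic API.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

profiles = pd.DataFrame({
    "visits":     [1, 14, 3, 22, 2, 18, 5, 30],
    "page_views": [2, 55, 9, 80, 4, 60, 12, 95],
    "bought":     [0, 1, 0, 1, 0, 1, 0, 1],
})

features = profiles[["visits", "page_views"]]
target = profiles["bought"]

# Train on an initial, representative slice of the data, then validate on held-out data.
X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.25, random_state=42, stratify=target
)
model = LogisticRegression().fit(X_train, y_train)
print("validation AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score every profile; in AI Workbench these scores would be written to a
# profile property so segments and connections can use them.
profiles["propensity_score"] = model.predict_proba(features)[:, 1]
print(profiles[["visits", "page_views", "propensity_score"]])
```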

Parameters: AI Workbench provides prebuilt AI models and examples that marketing teams can use to gain insights into their BlueConic data in real time. Data scientists can also write custom Python code in Jupyter notebooks; this code defines parameters and the types of values they accept (string, integer, BlueConic profile property, BlueConic segment, etc.).

Marketing teams use the AI Workbench UI to supply values for the parameters of the AI models and analytics – without ever having to write any code. For example, to run an AI model that predicts customer engagement levels, you might feed order history and recency into the model as inputs. The profile properties 'order history' and 'recency' and the resulting output are parameters in AI Workbench. Marketing teams can easily update these values in the Parameters tab, without having to know Python.
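The sketch below shows the general idea of parameterized notebook code: values supplied in the Parameters tab feed a handful of variables, and the rest of the code reads only those variables. The names and scoring logic are hypothetical, and this is not BlueConic's actual parameter-definition syntax.

```python
# A generic sketch of parameterized notebook code: values a marketer supplies in
# the Parameters tab feed variables like these, and the rest of the notebook only
# reads the variables. The names and scoring logic are hypothetical; this is not
# BlueConic's actual parameter-definition syntax.

# Parameters (string, integer, and profile-property-style inputs supplied via the UI):
order_history_property = "order_history"    # which profile property to use as input
recency_property = "recency"                # which profile property to use as input
score_output_property = "engagement_score"  # where the model's output should go
minimum_orders = 2                          # an integer parameter

def run_model(profile):
    """Toy scoring logic that reads only the parameter variables above."""
    orders = profile.get(order_history_property, [])
    recency_days = profile.get(recency_property, 999)
    if len(orders) < minimum_orders:
        return 0.0
    # More recent activity leads to a higher engagement score (purely illustrative).
    return round(min(1.0, len(orders) / (recency_days + 1)), 2)

example_profile = {"order_history": ["o-1", "o-2", "o-3"], "recency": 4}
print({score_output_property: run_model(example_profile)})
```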