The Mission

Our mission is to provide AI enthusiasts and professionals with as much applicable knowledge as we can. To do so, we have gathered and created the following content for you:

  • The Bible
    In simple terms, we created a 20-page document that contains all the knowledge you need for successful AI business implementation. The Bible also covers all the knowledge areas you will be tested on during the AI Essentials Assessment.
    Note: All of our content is strictly viewed from a business perspective; for a more technical elaboration or certification, please visit our referral page.

  • The Framework
    We've designed our own framework for Managers and Product Owners to easily manage a client-first-oriented team and drive results by getting successful pilots started.

  • The Open-source Collection
    To assist organisations in finding technical solutions, we've constructed an independent and comprehensive collection of open-source materials, plugins and APIs. With the help of this collection, your team will not only have a better overview of the market, but will also have the tools to easily design and build machine learning models for your business.


The AI Bible

The AI Bible includes all the content needed to successfully complete the AI Essentials Assessment and become an industry-acknowledged, certified AI business implementation expert. We provide all content free of charge, so everyone (certified or not) can benefit and has access to knowledge that helps accelerate AI innovation.

The Bible covers the following topics:

Theory Pages:

  • History and Definition

  • Machine Learning Definitions

  • Data Pre-processing

  • Eight different Machine Learning methods

  • Deep Learning / Artificial Neural Networks

  • APIs

  • Data Security & Ethics
     

Management Pages:

  • Organisation Readiness

  • Hardware Requirements

  • Agile Scrum

  • AIcompany Talent Framework (ATF)

  • AIcompany Pilot Management Framework (APMF)

  • Further Readings


The AIcompany Pilot Management Framework (APMF)

AIcompany designed a framework to help managers make a quick-scan selection of all processes where Artificial Intelligence could have a great impact. Ideally, we look for projects and models that can make a difference and deliver business value within 12 months or less. The AIcompany framework mainly focuses on the ‘what’ questions rather than the more technical ‘how’.

Most pilot frameworks and tools start from a company goal, vision and mission perspective: what does the organisation want to accomplish? We believe, however, that with the help of the APMF, organisations are able to get highly innovative initiatives started that can have a great, and often unexpected, impact on the entire organisation.

Step one: Client First?

We often see organisations starting machine learning initiatives because they see an opportunity to generate more revenue (return-on-investment driven). The APMF instead requires you to start identifying opportunities from a customer satisfaction point of view. Not only will this strengthen the business case, it will also help you find support within the organisation. Therefore, write down up to three Client First themes where an increase in customer satisfaction (i.e. NPS scores) will have a great impact on your core business. These themes should be described the way your customers/clients perceive the full service or product. For example, if you sell mortgages, remember that customers are not looking for mortgages; most likely, they would much rather get a house without the burden of a mortgage. A question that might help you come up with customer satisfaction themes: ‘Where would our customer really appreciate a more proactive approach?’


Step two: What is done manually and repeated often?

In most organisations, answering this question results in a long list of activities and processes. Try to group and label them, and keep it simple (measure them in time, e.g. hours, or as a percentage).

As mentioned, search for procedures and decisions that are made frequently and consistently. We recommend that businesses start by looking for suitable core processes. Try to collect as much data as possible about how each decision was made. In the example of a loan approval: what customer data was important for approving the loan? Who made the decision? At what time of day? How confident did they feel about it?


Step three: What do we actually know?

Most importantly, identify what data is available and currently stored. Similar data sets might be stored under differently formatted labels or in different back-office systems. Are processes or activities labelled? In what quantities? Is the data structured or unstructured? Are there any intermediate results stored, or linked overall results? What kind of job title or person made the decision? Again, how confident did they feel about this decision? All these answers will help you determine what kind of model to use and how to make it accurate.
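As a quick sketch of this data inventory, the questions above can be turned into a few lines of code. The table and column names below are purely hypothetical stand-ins for a back-office decision log, not something prescribed by the APMF:

```python
import pandas as pd

# Hypothetical export of a back-office decision log; the column names
# are illustrative assumptions, not prescribed by the APMF.
records = pd.DataFrame({
    "decision":   ["approve", "reject", "approve", None],
    "decided_by": ["analyst", "manager", "analyst", "analyst"],
    "confidence": [0.9, 0.6, None, 0.8],
})

# How much of the data is actually labelled?
labelled = records["decision"].notna().mean()

# Is the decision maker's confidence (an intermediate result) stored?
missing_confidence = records["confidence"].isna().sum()

print(f"labelled fraction: {labelled:.2f}")                # 0.75
print(f"missing confidence values: {missing_confidence}")  # 1
```

Even a small audit like this tells you what fraction of historical decisions is usable as training labels and which intermediate signals are missing.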


Step four: Design your model

Step four is considered the most time-consuming step. Here it comes down to the creativity and skill set of the AI specialist or data scientist you are working with to deliver value. What could an entire dedicated team of your best employees do for this one special client? What data would be required to design an appropriate learning model with high accuracy? Design a model that is able to predict, handle, advise and support this delivery. Remember, finding the perfect fit requires time and a lot of trial and error. A pitfall we often encounter in real-world organisations is a project team that starts out with a highly complex model for a problem that could easily be simplified and solved by a relatively simple model. Unsupervised machine learning might be the end goal (agents), but try to combine methods. For example, while doing market research you might want to segment consumer groups to target specific website behaviours; a clustering algorithm will be sufficient. Apply clustering techniques to derive a smaller number of features and use those features as inputs for training a classifier model. Image 4 shows a machine learning workflow we often see in real-world organisations.
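The clustering-then-classification combination described above can be sketched with scikit-learn (one of the libraries in our Open Source Collection). The dataset here is synthetic and all parameters are illustrative, not a recommendation:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for real customer data (parameters are illustrative).
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_redundant=0, class_sep=2.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# First, clustering derives a smaller number of features:
# each sample is represented by its distances to the 5 cluster centres.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X_train)
train_feats = kmeans.transform(X_train)
test_feats = kmeans.transform(X_test)

# Then a deliberately simple classifier is trained on those features.
clf = LogisticRegression(max_iter=1000).fit(train_feats, y_train)
print(f"test accuracy: {clf.score(test_feats, y_test):.2f}")
```

Note how the final model stays simple: twenty raw columns are reduced to five cluster-distance features before a plain logistic regression does the classification.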


Step Five: Select your dream team

(See the AIcompany Talent Framework for details.) Please do not hesitate or be shy in creating the pilot dream team. In large organisations we have noticed that even a one-hour commitment from the right domain specialist can make a huge difference. Selecting your team is also a great way to create ambassadors across several departments and divisions.


Step Six: Pilot Time

The pilot should demonstrate what a successful implementation of a machine learning model looks like. Deployment in a production system is recommended. How accurately does the machine learning model perform on new, real-world data? During the pilot there should be more than enough data and feedback from end users available to fine-tune the business case and rethink the return on investment for further scaling.
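A minimal sketch of that pilot evaluation, again using scikit-learn with synthetic stand-in data: score the model on data it has never seen and compare it against a trivial baseline before fine-tuning the business case:

```python
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: "historical" data used to build the pilot model,
# and "new" data arriving during the pilot that the model has never seen.
X, y = make_classification(n_samples=1000, n_features=10, random_state=1)
X_hist, X_new, y_hist, y_new = train_test_split(X, y, test_size=0.3,
                                                random_state=1)

model = DecisionTreeClassifier(max_depth=5, random_state=1).fit(X_hist, y_hist)
baseline = DummyClassifier(strategy="most_frequent").fit(X_hist, y_hist)

# The pilot question: how accurately does the model handle new data,
# and does it beat the trivial decision rule the business already has?
print(f"model accuracy on new data:    {model.score(X_new, y_new):.2f}")
print(f"baseline accuracy on new data: {baseline.score(X_new, y_new):.2f}")
```

The gap between the two numbers, not the model's accuracy alone, is what feeds the business case for further scaling.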

Good luck! Please do not hesitate to contact us if you would like more information or want to see practical cases (in your industry) where APMF designs were implemented.

Image 4: Machine learning workflow (Workflow.png)

Open Source Collection

One of the greatest developments (and one of the reasons we started AIcompany) is the vast amount of open-source material available: from entire labelled datasets to plug-and-play algorithms, so much is out there.

We've gathered as much knowledge as possible and provide the links so you can find these resources and actually start implementing!

The collection is ordered into backend software, machine/deep learning software (or both in one) and datasets. These are the three layers needed for a functioning AI algorithm.

IBM Watson
Watson, built by IBM, is quite different from the other players listed here. We chose to include it since it is considered one of the most dynamic and multi-functional AI platforms ever built. It is not open source, and it is actually pretty expensive, but Watson is an easy-to-use, plug-and-play AI that suits a great number of businesses and is able to connect to a lot of different APIs. If you have a significant budget and don't want to spend effort and resources on building your own AI algorithm, it is a good choice. Note, however, that Watson is designed around one-step processes; for niche algorithms that involve a lot of specific iterations, Watson might not be a great option. Watson does, however, perform very well on unstructured data.

https://www.ibm.com/watson/


TensorFlow
Google's TensorFlow is currently the most popular open-source software for AI algorithms. Developed by the Google Brain team, it is used in Google products and research. It is classified as 'backend' in the collection, because it provides the general architecture for building neural networks rather than a specific machine or deep learning algorithm.

https://www.tensorflow.org/


Theano
Theano is an open-source symbolic tensor manipulation framework developed by the MILA lab (formerly LISA) at Université de Montréal. It is, more or less, the direct competitor of TensorFlow.

https://github.com/Theano
 

Scikit-learn
We didn't put Sklearn in the backend part because many of its plugins and APIs can be used on top of Theano or TensorFlow. Sklearn can, however, be used independently and works on its own.

http://scikit-learn.org/stable/
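A minimal example of Sklearn working on its own, using one of its bundled datasets (model choice and parameters are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

# Sklearn on its own: load a bundled dataset, fit a model, predict.
X, y = load_iris(return_X_y=True)
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict(X[:2]))  # the first two iris samples belong to class 0
```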


Keras
Keras is a high-level neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result with the least possible delay is key to doing good research. Keras is mostly used for deep learning models.

https://keras.io/


Deeplearning4j
Deeplearning4j is the first commercial-grade, open-source, distributed deep-learning library written for Java and Scala. It is designed to be used in business environments, rather than as a research tool.

https://deeplearning4j.org/

Datasets
Next to an interesting amount of open-source software, a great number of datasets is available online for free. For companies that are young and/or don't have a sufficient amount of usable data, this is a reassuring thought. These sets can be used to train your self-built learning algorithm, so it will be perfectly adjusted for your purpose.

Here are some resources:

https://archive.ics.uci.edu/ml/machine-learning-databases/

https://archive.ics.uci.edu/ml/datasets.html
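Several UCI datasets (such as the wine data) are also bundled with scikit-learn, so you can experiment before wiring up your own download. A minimal sketch of training on such a free dataset:

```python
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# The wine data originally comes from the UCI repository linked above;
# scikit-learn bundles a copy, so no download step is needed here.
X, y = load_wine(return_X_y=True)

scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```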


We will keep expanding the content on this page and update it weekly to keep up with the rapidly changing market trends of AI.