With over a decade of experience, Deep Learning Partnership offers a range of AI consulting and training solutions. Together, you and our team of world-class Machine Learning Engineers and AI Solutions Architects will identify how AI can help your company increase productivity and achieve sustainable competitive advantage. We will identify what data you have, what you need and how to source the data gaps, and then fine-tune open-source models to provide you with custom LLMs for the various parts of your business. This can include operations, management, new products and services, HR, and sales and marketing.
AI technologies we use include LLMs, computer vision, reinforcement learning, AI agents, Gemini, ChatGPT, LangChain, vector databases, Pinecone, ChromaDB, RAG, TensorFlow, PyTorch and JAX. We specialize in training and deploying custom models for our clients using the latest frameworks, guardrail techniques and best practices to ensure your AI implementations are safe, secure and compliant. We take GRC (governance, risk & compliance) very seriously.
We have accumulated over a decade of experience in growing and managing agile AI practices, along with building key client and partner relationships. Deep Learning Partnership designs, tests and implements end-to-end AI and MLOps solutions for our enterprise, SME and start-up clients across all business domains, including healthcare, finance, transportation and energy.
Our training courses below reflect our Consultants' wide range of expertise. Our Consultants are also available to speak at conferences and industry events on the latest developments in artificial intelligence.
An overview of generative AI: its history and development, current and future frameworks and use cases, and the underlying technologies. We will take a look at various generative AI frameworks, both open and closed source, including Gemini, ChatGPT, Llama 2, Mistral, Stable Diffusion and DALL-E 3, along with actual business applications. We will examine the various types of generative AI, including transformers, diffusion models, GANs and VAEs, along with cloud platform offerings including AWS Bedrock, Google Vertex AI and Azure OpenAI.
There will be opportunities for some high-level hands-on exploration, including worked examples, fine-tuning, AI agents and prompt engineering. This is intended as an overview class for technologists and business leaders who wish to gain a better understanding of generative AI technologies and their potential for business disruption and competitive advantage.
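To give a flavour of the hands-on element, here is a minimal text-generation sketch using the open-source Hugging Face transformers library; the small GPT-2 model stands in for the larger frameworks named above, and the prompt is purely illustrative.

```python
# A minimal, illustrative text-generation sketch using the Hugging Face
# transformers library; "gpt2" is used here only as a small, freely
# available stand-in for the larger models discussed in the course.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Generative AI can help a retail business by"
outputs = generator(prompt, max_new_tokens=40)

print(outputs[0]["generated_text"])
```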
In this course we take attendees through an overview of AI: a definition, a brief history, the latest developments including generative AI and LLMs, and how AI will impact business and society in general. We cover the technology basics as well as how AI is being used by companies, from start-ups to multinationals, to increase productivity, develop new products and services, and improve on existing ones.
We examine data sets, hardware, models, agents and MLOps platforms, and then focus on business use cases and scaling to real-world AI deployments in production environments. This course is high level, so it is suitable for business executives and anyone curious about what the AI world looks like from a platform and business perspective.
In this course, with an 80/20 practice/theory split, we will look at the techniques and use cases of multimodal large language models (LLMs), including speech and text generation as well as image and video generation. We will examine some of the popular multimodal LLM frameworks from Google, OpenAI and Meta, such as Gemini, GPT-4 and LLaMA respectively. We will look at how businesses are using LLMs to gain competitive advantage by considering several use cases. The labs will be done using the above frameworks, so previous exposure to Python is advisable.
We will configure LLM frameworks on the cloud and then fine-tune them using various multimodal data sets. Syllabus includes:
• Understanding large language model architectures
• The changing landscape of multimodal LLMs
• Understanding embedding techniques
• Vector databases & the orchestration layer (LangChain)
• AI agents & retrieval-augmented generation (RAG)
• Training and serving LLMs
• Fine-tuning LLMs, RLHF, prompt engineering
• Testing and evaluation
• Deploying and scaling LLMs in production
• Cloud LLM services & MLOps
Exposure to Jupyter notebooks and Python programming is desirable. A toy sketch of the vector retrieval idea behind RAG is shown below.
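As a taster of the retrieval topics above, the following sketch shows the vector-search idea that underpins RAG and vector databases. The hand-made embeddings and document names are placeholders; in the labs a real embedding model and a store such as Pinecone or ChromaDB would be used.

```python
# A toy illustration of vector retrieval, the idea behind RAG and vector
# databases covered in the course. The "embeddings" here are hand-made
# placeholders; in practice they come from an embedding model and live
# in a vector store such as Pinecone or ChromaDB.
import numpy as np

documents = {
    "refund policy": np.array([0.9, 0.1, 0.0]),
    "shipping times": np.array([0.1, 0.8, 0.2]),
    "product warranty": np.array([0.7, 0.2, 0.3]),
}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend this vector is the embedding of the user's question.
query = np.array([0.85, 0.15, 0.1])

# Rank documents by similarity and keep the best match as the "context"
# that would be passed to the LLM alongside the original question.
best_doc = max(documents, key=lambda name: cosine_similarity(documents[name], query))
print("Retrieved context:", best_doc)
```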
We will do a deep dive into generative AI as a productivity accelerator, including hands-on work with some of the popular LLM frameworks and use cases. The approximate schedule is:
Day 1 - ANNs, Generative AI and foundation model overview, including generative AI cloud service offerings from AWS, Google & Azure
Day 2 - Hands-on with LLM frameworks including GPT-4, Gemini, Mistral and Llama 2
Day 3 - Vector databases, embeddings, AI agents, RAG, fine-tuning, and other LLM ecosystem components and techniques
Day 4 - Several exercises and code walkthroughs using all of the above frameworks and techniques, and how to integrate these into business workflows
Day 5 - Building a mini-project using any of the frameworks we have considered so far plus any others participants would like to use.
This is a business-focussed introduction to generative AI and LLMs which will include concepts, use cases and code walkthroughs to demonstrate examples using real-world data sets.
Since Google open-sourced its deep learning framework TensorFlow, it has become one of the most popular tools for data scientists and machine learning experts worldwide. In this course we will explore the various facets of TensorFlow – expect 80% lab content, both locally on your laptop and in the cloud.
We will cover various business use cases with case studies and code walkthroughs using Colab notebooks. Some exposure to Python, TensorFlow and Jupyter notebooks is required.
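For a sense of the lab style, here is a minimal TensorFlow/Keras classifier, assuming TensorFlow 2.x; the random data is purely illustrative and stands in for the business data sets used in class.

```python
# A minimal TensorFlow/Keras binary classifier of the kind built in the
# labs. The random data below is purely illustrative.
import numpy as np
import tensorflow as tf

# 1,000 samples with 20 features and a binary label.
X = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=1)
```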
In this intensive five-day hands-on TensorFlow, JAX & PyTorch course, we will explore how different businesses are using artificial intelligence and generative AI to improve profits and productivity. We will explore, through theory and labs, how the ML frameworks TensorFlow, JAX and PyTorch are being used to classify images, recognize text and speech, generate images and language, and make predictions from data to help companies make more efficient and effective business decisions.
We will examine business use cases using a variety of image, text, speech and video data sets. Hands-on lab instruction will be done both locally and in the cloud, using GPUs, TPUs, LLMOps and generative AI frameworks. There will be approximately a 70% lab element to this course.
A one-day overview of the main concepts and technologies underlying the practice of data science, with code walkthroughs. We will cover the basics, from business cases to which technologies to use and when, and how businesses are leveraging them today to drive innovation and productivity.
From data ingestion and engineering to machine learning and generative AI, see how the latest technologies are differentiating businesses from their competitors through the exploration of real-world business scenarios.
Consisting of an 80/20 split of labs and concepts, this five-day data science course provides a comprehensive exploration of all of the main aspects of data science applied to business. Going into much more depth than our one-day overview course in terms of both theory and practice, topics covered include: strategy and planning; data creation, ingestion and cleaning; data analysis using frameworks such as Kubernetes, TensorFlow, PyTorch, JAX and AutoML; using LLM tools like GPT-4 and LangChain to innovate new products and services; and finally business case presentation and recommendations. Several real-world business use cases will be covered.
Prerequisites include hands-on familiarity with Python, TensorFlow and Jupyter notebooks.
This three-day course will introduce participants to the major recent advancements in the field of reinforcement learning and how these developments can be applied in organizations to build intelligent systems and improve business processes. It will explore various real-world scenarios to implement some of the latest algorithms for building intelligent applications and services using the Dopamine framework. We will also cover how RLHF is used in generative AI applications.
As well as exploring the various RL frameworks and algorithms, participants will gain an understanding of what RL algorithms to use in a given business context. Domain examples will include self-driving cars, manufacturing (robotics), medicine and financial services. As this course comprises a 70% lab element, some exposure to Python and TensorFlow is desirable.
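To illustrate the core idea behind the course, the sketch below implements plain tabular Q-learning on a toy corridor problem in NumPy; it is not the Dopamine framework itself, just a minimal example of the update rule the labs build on.

```python
# Self-contained tabular Q-learning on a tiny corridor environment.
# This only illustrates the core update rule; the course labs use the
# Dopamine framework and richer environments.
import numpy as np

n_states, n_actions = 5, 2      # 5-state corridor, actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.95, 0.1
rng = np.random.default_rng(0)

def step(state, action):
    """Move along the corridor; reaching the right-hand end gives reward 1."""
    next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if next_state == n_states - 1 else 0.0
    done = next_state == n_states - 1
    return next_state, reward, done

for episode in range(200):
    state, done = 0, False
    while not done:
        if rng.random() < epsilon:
            action = int(rng.integers(n_actions))                 # explore
        else:
            best = np.flatnonzero(Q[state] == Q[state].max())
            action = int(rng.choice(best))                        # exploit, ties broken at random
        next_state, reward, done = step(state, action)
        # Q-learning update: move Q towards reward + discounted best future value.
        Q[state, action] += alpha * (reward + gamma * np.max(Q[next_state]) - Q[state, action])
        state = next_state

print(np.round(Q, 2))
```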
In this three-day hands-on course, we will explore Julia’s capabilities and see why it has fast become a favoured language for data science and machine learning aficionados along with Python and R. As well as getting to know the language and some of its main statistical libraries, we will apply it to analysing various business data sets to produce meaningful business outcomes and predictions.
We will particularly focus on deep learning packages such as Flux.jl and TensorFlow. This course will consist of a 70/30 mix of practice and theory, with labs being performed using Jupyter notebooks. Some prior Julia programming experience is desirable.
Massive (e.g., petabyte-scale) data sets require massively parallel processing in order to do timely analysis. GPUs & TPUs are well suited for this task, and in this practical hands-on course, we will learn how to program them to extract useful information.
We will configure GPU instances on the cloud and then use them to analyse various business data sets. We will look at how TPUs are being used in the Google Cloud Platform and examine other AI ASICs available on the market. 80/20 practice/theory. Some exposure to Python and Jupyter notebooks is desirable.
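As a small taste of the labs, the sketch below offloads a large matrix multiplication to a GPU using PyTorch, falling back to the CPU if none is available; the matrix sizes are arbitrary.

```python
# Minimal sketch of running a large matrix multiplication on a GPU with
# PyTorch; it falls back to the CPU if no GPU is present.
import time
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Running on:", device)

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

start = time.time()
c = a @ b
if device.type == "cuda":
    torch.cuda.synchronize()   # wait for the asynchronous GPU kernel to finish
print(f"Matrix multiply took {time.time() - start:.3f} seconds")
```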
A 12-week bootcamp exploring how businesses are using generative AI for competitive advantage and the technologies underlying this. There will be extensive labs each week, so applicants should be comfortable with programming.
Week 1 - Basics of artificial neural networks (ANNs) - Data sets, algorithms, models, hardware, generative AI, foundation models, LLMs & LLMOps. Intro labs.
Week 2 - Frameworks including TensorFlow, JAX, PyTorch, GPT-4, Gemini and several open source LLMs.
Week 3 - CNNs, NLP, RNNs, LSTMs, Transformers, LLMs and time series data.
Week 4 - Generative AI deep dive - LLMs, foundation models, VAE, GANs, transformers, diffusion models, open vs closed source models, pre-training.
Week 5 - Further generative AI, including hands-on work with GPT-4, Gemini, Mistral, DALL-E 3, CLIP, Stable Diffusion, StableLM and more. Fine-tuning, RLHF, prompt engineering, distillation, quantization & evaluation.
Week 6 - Deep reinforcement learning, generative AI, AI agents, RAG, and the AI orchestration layer.
Week 7 - Generative AI ecosystems - includes various plugins, LangChain, vector databases, embeddings, AutoGPT, various agents and Hugging Face, along with business applications.
Week 8 - Deploying your generative AI solution to production - LLMOps, AWS Bedrock, Google Vertex AI and Azure OpenAI.
Weeks 9-12 - Capstone Project - Students work with a company on a commercial LLM project with a view to being hired upon completion.
Prerequisites: Math (linear algebra, calculus and probability), programming skills in Python, and comfort at the command line. Some previous exposure to a major ML framework such as TensorFlow, JAX or PyTorch is desirable.
Neuromorphic computing is an emerging computing paradigm that uses analogue processors to run spiking neural networks, much the way the brain does its computations. Neuromorphic processors utilize massively parallel computation in the synaptic connections between their artificial neurons and run at very low power (a 1000x reduction in power consumption over CPUs, for example).
Despite sounding esoteric, this technology is seeing commercial application in companies and government organizations today. In this two-day overview course, we will examine the technology from theory to practice, survey the past and present research and product landscapes, take a look at the various product offerings available today and the differences between them, and cover some case studies to show how companies are already reaping performance benefits from this exciting new technology. Basic Python programming is required.
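For a first feel of spiking models, the sketch below simulates a textbook leaky integrate-and-fire neuron in NumPy; the parameter values are illustrative and no particular neuromorphic chip or SDK is assumed.

```python
# A simple leaky integrate-and-fire (LIF) neuron simulated in NumPy, the
# textbook model behind the spiking neural networks discussed in the
# course; parameter values are illustrative only.
import numpy as np

dt = 1e-3             # time step (s)
tau = 20e-3           # membrane time constant (s)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0
input_current = 1.2   # constant drive (arbitrary units)

v = v_rest
spike_times = []
for step in range(1000):                      # simulate 1 second
    # Leaky integration: decay towards rest, pushed up by the input.
    v += dt / tau * (v_rest - v + input_current)
    if v >= v_thresh:                         # threshold crossing emits a spike
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in 1 s of simulated time")
```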
Starting off with a comparison between Bayesian and classical statistics, this course introduces participants to the basics of Bayesian statistics, including Bayesian probability and inference. It will cover theory and practice, with some hands-on labs in Python so that participants can gain experience with analysing actual data sets using Bayesian methods.
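As an example of the style of lab exercise, the sketch below performs a conjugate Beta-Binomial update on an illustrative conversion-rate data set using SciPy; the prior and the observed counts are made up for the example.

```python
# A minimal Bayesian inference sketch: updating a Beta prior on a
# conversion rate after observing binomial data. All numbers are
# illustrative.
from scipy import stats

# Prior belief about a conversion rate: Beta(2, 2), centred on 0.5.
prior_a, prior_b = 2, 2

# Observed data: 18 conversions out of 100 visitors.
conversions, visitors = 18, 100

# Conjugate update: the posterior is also a Beta distribution.
post_a = prior_a + conversions
post_b = prior_b + (visitors - conversions)
posterior = stats.beta(post_a, post_b)

lo, hi = posterior.interval(0.95)
print(f"Posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```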
By now you've heard and read about quantum computing, and are wondering if your business can use it to gain competitive advantage. In this half day workshop, you will learn about where exactly quantum computing is in terms of enterprise readiness and how businesses can harness what is available today.
We will look at various real-world use cases where companies are currently using quantum computing, including in the transportation, financial, energy, materials and pharmaceutical sectors. By the end of the session participants will have a clear understanding of what quantum computing is, where it is being applied today, as well as being provided with a quantum roadmap.
We examine data sets, hardware, algorithms and full-stack platforms, and then focus on business use cases and scaling to real-world AI deployments in production environments. This course is high level, so it is suitable for business executives and anyone curious about what the new AI-first world looks like from a business perspective.
In this overview course, we will explore quantum computing including theory (quantum physics without a lot of maths), hardware, quantum algorithms and software frameworks, as well as looking at quantum computing services available on the market today.
We will explore the various hardware vendors' cloud service offerings, including D-Wave, IBM and Rigetti. We will also look at some QC use cases within the domains of machine learning, chemistry and optimization. Ideal for businesses who wish to become quantum-ready.
We will look at the theory behind quantum computing as well as the practical aspects of building and programming a quantum computer. After covering the various types of quantum computing hardware, we will look at the frameworks available today and how they are being used across different business domains.
There will be a hands-on element whereby we will use a quantum computer to program a selection of quantum algorithms using the Qiskit framework. Familiarity with the Linux command line as well as the Python programming language is required. There will be roughly a 30/70 theory/hands-on split.
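As a taste of the hands-on element, the sketch below builds a two-qubit Bell-state circuit with Qiskit and inspects its statevector; because execution backends vary between Qiskit versions, it sticks to circuit construction and the quantum_info utilities.

```python
# A minimal Qiskit sketch of the kind built in the labs: a two-qubit
# Bell-state circuit, inspected via its statevector rather than run on
# a specific backend.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(qc.draw())
print(state.probabilities_dict())   # expect ~50/50 split between |00> and |11>
```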
8-week bootcamp exploring how businesses can use quantum computing for competitive advantage. There will be extensive labs each week in Qiskit. Get in touch for further details and availability.
Copyright © 2013-2024 Deep Learning Partnership - All Rights Reserved.