Expect 3 parallel tracks: knowledge talks, tech talks & workshops. plugin will also feature an all-night ML hackathon.
All timings are in Indian Standard Time (IST).
While there has been rapid advancement in AI/ML with AutoML, deep learning, NLP, neural nets, etc., the true business benefits fall far behind. AI/ML often falls short of its promise to rapidly transform businesses and drive profitability. One of the key gaps is the inability to shift from point-based solutions to an interconnected ecosystem of AI/ML solutions that spans and scales across the enterprise.
How can an enterprise move from being an AI experimenter, to operationalizing AI, to being an AI-driven business? While the pace can vary, the strategic shifts and technological investments needed remain fairly similar. This talk focuses on how these shifts can be made to unleash the power of AI/ML and rapidly reap benefits for the business.
There has always been a trade-off between “Accuracy” and “Interpretability” in the field of applied ML. This dichotomy presents significant challenges when it comes to applying ML to create sustainable and scalable impact for businesses. While linear models are fairly explainable, as they often depict the average behaviour in the data, their accuracy is certainly sub-optimal. Non-linear models are significantly superior in accuracy, but extremely difficult to explain. The lack of transparency (read: black box!) in complex ML models hits the trust factor among stakeholders hard.
In this talk Satyamoy will simplify the practical significance of the topic. He will bring clarity around what it means for businesses and how one can leverage the progressive research in this field to drive explainability without sacrificing accuracy. He will also walk participants through Analyttica’s innovative IP solution addressing this field, using a real business case.
Data is growing like never before and every enterprise wants to be insight-driven. The need of the hour is to modernize the BI architecture by moving to the cloud, as traditional systems can neither accommodate this growing data nor deliver the desired BI performance. Join our session to learn how our “Smart OLAP” technology allows businesses to become more productive with interactive, self-service business intelligence at enterprise scale, at a fraction of the cost of their traditional architecture. We’ll touch upon multiple use cases across different industries and functions, such as Customer 360, supply chain transformation, viewership analytics, financial analytics, and so on.
You’ll also learn how Kyvos enables:
• IT to build a future-proof architecture for growing data and BI needs
• BI developers to create a single source of truth across different businesses
• Business users to slice and dice their data instantly and make quicker, better decisions
This talk speaks about the power of, and the behaviour change involved in, adopting the new norms of digital technology driven by the environment, and about data-driven technologies enabling companies to reduce their impact while complementing humans in functions like sales or supply chain.
Liberty has been taken to expand the traditional definition of health and the health industry. Its expanded scope refers to the offerings and solutions of the healthcare, wellness, and nutrition industries, the unifying bond being solutions developed on the principles of the biological and life sciences. Each of them has experienced revolutions attributed to data analytics, and they are surely poised for many more. Interestingly, the anecdotes and highlights from each industry's research centers are very different. During the talk, I will share snippets from my experiences in research and development groups across these industries.
Due to the complexity and heterogeneity of the smart grid, coupled with the high volume, velocity, and variety of data being generated, artificial intelligence has emerged as one of the most promising and enabling technologies for its future development and success. The aim of this presentation is to highlight various use cases and methodologies for applying artificial intelligence techniques, and their impact on smart grid development.
The explosion of "Data Science or AI as a service" and citizen data scientists has brought all organizations to the forefront of the AI world. However, many organizations are unable to generate economic impact from these emerging technologies. A large body of articles and papers talks about the success of models, yet the economic value remains elusive. In this talk, I will discuss some of the commandments of failure.
Applied AI has become an advanced automation accelerator. As the new normal, organisations globally can rely on a hyperautomation toolset, encompassing a mix of technologies like process mining, RPA, and AI, to overcome the challenges in driving a superior customer experience (CX).
The increasing availability of data accompanied by the rapid progression of AI have impacted every aspect of medicine. Advanced data analytics/AI for early drug discovery, clinical trial optimization, and business intelligence have revolutionized access to medicines for broad patient populations. In this discussion, the speaker will focus on how manufacturers and contract research organizations around the globe are adopting data advancements to enhance clinical development of drugs and improve overall patient care.
Building analytics teams can start as Artisanal or Modular. We will discuss how to select a starting point.
Analytics teams evolve and grow into Hybrid organizations. We will discuss how to manage this evolution.
High-performance analytics teams are unique. We will discuss how best to manage them.
The digital revolution is fundamentally changing the world as we know it—from how consumers interact and behave to how business gets done. And it’s forcing organizations to re-envision the customer experience, to re-imagine operational processes, and to re-think entire business models. The use of technology to radically improve performance provides organizations with incredible opportunities for sure, but delivering on its promise of growth and profitability requires a responsive, agile, and flexible decision-making capability that’s built on actionable insight. To get there, you need to change the way you think about, plan, and execute business intelligence.
This session is intended to help you do just that. In it, you will:
Over the last 100 years our society has made significant progress. However, some social perils still exist which need to be solved. In the current age of relentless technological momentum, it is often too easy to lose sight of the greater purpose of our work. While we all carry different motivations with us, I believe that we are unified by the desire to make an impact with our work and to contribute meaningfully to solving real-life problems at very large scale.
Data and analytics leaders are in a unique position to build programs that benefit the Data for Good movement, in which people and organizations transcend organizational boundaries to use data to improve society. In this session, Anirban intends to cover how data science can help solve some of the pertinent problems of our society, and to seek help from fellow professionals to contribute to this cause as well.
Modeling the future for real-time strategic decisions for investors and decision makers in the telecom industry, connected to the JIO case as well as experiences from Indonesia and elsewhere.
I will also talk about the R&D efforts that exist on the operator side nowadays, and how to manage teams of talented data scientists.
Finally, I will give a short demo of how to make strategic decisions using an interactive portal.
1. Analytics Journey
2. Breaking the Analytics Wall
3. Artificial Intelligence Essentials
4. Machine Learning Essentials – Supervised, Unsupervised, Reinforcement Learning
5. Deep Learning Essentials
6. Artificial Intelligence, Machine Learning and Deep Learning – How to Distinguish between Hype and Reality
7. Artificial Intelligence Applications for Business
8. Conclusion and Q&A
Software as a service (SaaS) and platform as a service (PaaS) have become part of the everyday tech landscape since emerging as delivery models, shifting how enterprises understand technology. Today, most companies use at least one type of “as a service” offering to focus on their core business and spend less money on supporting services. Now a new “_” as a service model is aspiring to become just as widely adopted, based on its potential to drive business outcomes with unmatched efficiency: artificial intelligence as a service (AIaaS). AIaaS is the third-party offering of artificial intelligence (AI). It allows individuals and companies to experiment with AI for various purposes without large initial investment and with lower risk. The combination of the SaaS business model and AI services has brought AI to the masses without a heavy price tag.
Improving the customer experience and increasing profitability in telecom using data science, AI, and advanced analytics technologies. The primary objectives are improved operational efficiency, precise marketing, innovative business models, and real-time analysis and decision making, standing on the pillars of data quality, scale, and speed of execution.
All of this has to be achieved on top of surging data from upstream systems such as enterprise systems, connected IoT devices, mobile apps, etc.
Customer centricity is at the heart of many organizations globally today. Especially with today’s restrictions on mobility, digital and data become critical to realizing customer centricity, through aspects like “outside-in” design thinking, conversational systems for customer-facing apps, and enabling easy access to customer-facing data. I’d like to talk about how Michelin thrives on its customer-centricity strategy and the continuously increasing scope of data & AI within it.
I will address the importance for everyone to become data literate for the future of work. This applies both to individuals and to organizations. I will cover five major themes: data awareness (what is it?), data relevance (why me?), data literacy (show me how), data science (where's the science?), and the data imperative (create and do something with data). I will also discuss why it is important for data scientists to lead the efforts to build data literacy in society, in schools, and in professional development activities for organizations.
Sensors play a significant role in digital twin and productivity-improvement topics in industry. One might be curious whether there are any alternatives, and how good those solutions are. This talk focuses on adapting sound data to analyze the performance of equipment. A live project implemented in Norway will be discussed.
Uber Eats has become synonymous with online food ordering. With an increasing selection of restaurants and dishes in the app, personalization is crucial to driving growth. One aspect of personalization is better recommendation of restaurants and dishes to users, so they can get the right food at the right time.
In this talk, we present how to augment the ranking models with better representations of users, dishes, and restaurants. Specifically, we show how to leverage the graph structure of Uber Eats data to learn node embeddings of various entities using state-of-the-art Graph Convolutional Networks implemented in TensorFlow. We also show that these methods outperform standard matrix factorization approaches for our use case.
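As a rough illustration of the baseline the talk compares against, the sketch below learns user and item embeddings by plain matrix factorization with gradient descent in NumPy. The interaction matrix, sizes, and learning rate are invented for illustration; this is not Uber's implementation, and the GCN approach itself is not reproduced here.

```python
import numpy as np

# Toy matrix-factorization baseline: learn user and item embeddings
# from a binary (implicit-feedback) interaction matrix.
rng = np.random.default_rng(0)
n_users, n_items, dim = 4, 5, 3

# 1 = user interacted with (ordered from) this restaurant; made-up data.
R = rng.integers(0, 2, size=(n_users, n_items)).astype(float)

U = 0.1 * rng.standard_normal((n_users, dim))  # user embeddings
V = 0.1 * rng.standard_normal((n_items, dim))  # item embeddings

lr = 0.05
initial_loss = np.mean((R - U @ V.T) ** 2)
for _ in range(500):
    err = U @ V.T - R          # prediction error on every cell
    gU = err @ V / n_items     # gradient w.r.t. user embeddings
    gV = err.T @ U / n_users   # gradient w.r.t. item embeddings
    U -= lr * gU
    V -= lr * gV
final_loss = np.mean((R - U @ V.T) ** 2)

# Recommend by ranking items for a user with the dot product.
scores = U[0] @ V.T
```

Graph-based methods go further by propagating information along edges (e.g. user-restaurant-dish connections), which is what the GCN embeddings in the talk exploit.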
Data across business domains has become large and complex due to integration with multiple new technologies, and deciphering key information from these data repositories is a time-consuming challenge in this fast-paced modern world. Analytics has also transformed from a machine learning endeavour into a business-oriented lifestyle. To cater to these new requirements and the newer generation of analytics, we need a seamless platform for the end-to-end data analytics lifecycle. The Altair Knowledge Works suite for data analytics provides solutions for data preparation, data science and machine learning, stream processing, and visualization.
In any college or university, taking attendance is a compulsory requirement for teachers. It is a time-consuming task and prone to human error. The proposed project automates this process using a single high-resolution surveillance camera that sends video to a Convolutional Neural Network (CNN) to detect faces and identify the students in the video. An online Google Sheet is updated automatically, providing a reliable and efficient attendance system.
Many organizations like banks and insurance companies give customers empty forms to submit their applications. The applicants fill in the applications and submit them, but eventually the data from these forms needs to be fed into a machine. Our product can help these organizations update the contents of these forms in spreadsheets with just a single ‘tap’. In this project, we considered a particular case and showed how this system can work for updating submitted college registration forms. We used image processing techniques like dilation, adaptive thresholding, and Gaussian blurring to remove noise, and used contours to detect the boxes in the form. We trained on these boxes with a modified ResNet architecture to get the final model.
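The preprocessing operations named above can be sketched in plain NumPy; in a real pipeline one would use OpenCV (cv2.GaussianBlur, cv2.adaptiveThreshold, cv2.dilate, cv2.findContours) instead. The tiny synthetic "form" image and the parameter values are assumptions for illustration.

```python
import numpy as np

def adaptive_threshold(img, block=3, c=2):
    """Binarize: a pixel is foreground if darker than its local mean
    minus c. Mirrors the idea of mean adaptive thresholding."""
    h, w = img.shape
    pad = block // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros_like(img, dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            local = padded[y:y + block, x:x + block]
            out[y, x] = 255 if img[y, x] < local.mean() - c else 0
    return out

def dilate(binary, k=3):
    """k x k morphological dilation: grow foreground regions so thin
    box borders become solid before contour detection."""
    pad = k // 2
    padded = np.pad(binary, pad, mode="constant")
    h, w = binary.shape
    out = np.zeros_like(binary)
    for y in range(h):
        for x in range(w):
            out[y, x] = padded[y:y + k, x:x + k].max()
    return out

# A tiny synthetic "form": a dark box (0) on a light page (255).
form = np.full((8, 8), 255, dtype=np.uint8)
form[2:6, 2:6] = 0
mask = adaptive_threshold(form)   # marks the box edges
thick = dilate(mask)              # thickened edges
```

Contour detection would then walk the connected foreground regions of `thick` to locate each box before cropping it for the ResNet classifier.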
Many data science and machine learning projects fail, which poses high risks for companies. In product development, though, there is a concept called “lean startup” which is all about minimizing risk. In this talk, I will present how lean concepts can be applied to ML projects, and how the two are actually closely related. I will also present a project we are currently working on where we applied those ideas.
This proposed talk will focus on how APAC companies prepare for the next big technology step change in AI while running a heterogeneous infrastructure environment (edge, fog, colo, primary data centre, etc.).
Organisations adopting AI are looking for best practices gleaned from hyper-scale companies like Facebook, Microsoft, Baidu, and Google, which run highly efficient private and public clouds. The sharing of open hardware and data centre designs is a core strategy for these companies and the basis for the Open Compute Project (OCP).
The Open Compute Project (OCP) was started by Facebook in 2011 with the idea of delivering the most efficient designs for scalable computing through an open source hardware community. We believe that openly sharing ideas, specifications, and other intellectual property is the key to maximizing innovation and reducing complexity in
Machine learning was already expanding its footprint in our daily lives. The recent global pandemic has forced us to pause and rethink, and we are now in urgent need of pressing the pedal of innovation in every walk of life. Arguably, at the forefront of this endeavor has to be our ability to expand the horizons of how we think about data. Never before in history has ML been as important as it is today, and never before has the ask been of such unprecedented scale. Be it the battle for living-room entertainment, mitigating risks in business and supply chain models, accessing real-time data interventions, expediting data-intensive medical research, or even distancing ourselves from fellow beings, it is data science that we will look to for a quick recipe.
Narrowing the purview to computer vision, this talk, while focusing on examples from the consumer video entertainment sector, will also delve into everything in the playbook of video intelligence and its far-reaching empowerment capabilities. Using a concise framework, we will analyze the world of ML computer vision models, talking about the wise, the ugly, and the extreme ones, their ability to tackle business challenges, and use-case economics.
This talk covers American Express’ exciting journey exploring AI and deep learning techniques to generate the next set of data innovations by deriving intelligence from the data within its global, integrated network. Learn how using credit card data has helped improve fraud decisions and elevate the payment experience of millions of Card Members across the globe.
When the first settlers came to America they looked for gold. They looked in Virginia, North Carolina and Tennessee. There isn’t much gold there. They should have looked in California, because there is a lot of gold there. Today people are looking for gold in data. And most people are looking in classical transaction based systems. They are looking in the wrong place. They should be looking at text, because there is plenty of gold just waiting to be found in text. And nobody is looking there.
This presentation on data architecture looks at the broad spectrum of all of the data in the corporation, then focuses on where there is a lot of business value.
The banking and finance domain is an early adopter of data analytics. Risk management is a million-dollar problem, and the industry has evolved to use sophisticated models to predict bad loans, fraudulent transactions, etc. Credit risk modeling is ever-changing due to the recent popularity of ML and AI and the availability of data from various digital and alternative data sources.
The challenge faced by data scientists is to improve accuracy using sophisticated supervised learning algorithms while following the central bank's regulatory guidelines. In the session, we will learn in-depth, step-by-step methods to develop a credit score, its use, validation of the models, and recalibration techniques.
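One concrete step in traditional scorecard development is computing Weight of Evidence (WOE) and Information Value (IV) for a binned variable. The sketch below uses made-up good/bad loan counts purely for illustration.

```python
import math

# WOE per bin and IV for one candidate variable -- a standard step in
# credit-scorecard development. Counts are invented for illustration.
bins = [
    # (bin label, count of good loans, count of bad loans)
    ("income < 30k", 100, 40),
    ("30k-60k",      300, 30),
    ("> 60k",        600, 10),
]

total_good = sum(g for _, g, _ in bins)
total_bad = sum(b for _, _, b in bins)

iv = 0.0
woe = {}
for label, good, bad in bins:
    dist_good = good / total_good   # share of all goods in this bin
    dist_bad = bad / total_bad      # share of all bads in this bin
    woe[label] = math.log(dist_good / dist_bad)
    iv += (dist_good - dist_bad) * woe[label]
```

A positive WOE marks a bin with proportionally more goods than bads; variables are then kept or dropped based on IV, and the selected WOE-encoded variables feed a logistic regression that is scaled into score points.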
Business decisions that rely on analytical models could be suffering from model drift. The real impact is likely to be flawed decisions that affect the bottom line. Now, more than ever, contactless commerce is reshaping consumer behavior, and the “new normal” has brought about an immediate need to adjust to the global COVID-19 situation.
The fourth Industrial Revolution, often called Industry 4.0, is now underway, and there is growing interest in how analytics is revolutionizing decision making in smart factories. But how can one leverage analytics on a shop floor in real time to identify hidden indicators of future downtimes or to spot anomalies? The talk will focus on how to access data from the shop floor through OT/IT convergence solutions and build useful analytics to tackle asset monitoring and predictive maintenance.
Modern complex machine learning and deep learning models are naturally opaque, and as artificial intelligence becomes an increasing part of our daily lives, the need to trust AI-based systems with all manner of decision making and prediction is paramount. Explainable AI refers to methods and techniques in the application of artificial intelligence (AI) such that the results of a solution can be understood by human experts. It contrasts with the concept of the “black box” in machine learning and enables transparency. On a global level, this means that we understand which features the model is using, and to what extent, when making a decision; for each feature, we want to understand how it is used, depending on the values it takes. On a local level, that is, for any individual data point, we want to see why the model made a certain decision. This can give us more insight into where and why the model might fail.
In this talk, Shashank will discuss a game-theory-based technique for AI explainability.
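The usual game-theory-based technique here is Shapley values (the idea behind the SHAP library): each feature's attribution is its average marginal contribution across all orders in which features could join the prediction. A minimal exact computation for a toy model (effect sizes invented for illustration):

```python
from itertools import permutations

def shapley_values(features, value):
    """Exact Shapley values: average the marginal contribution of each
    feature over every ordering in which it can join the coalition."""
    contrib = {f: 0.0 for f in features}
    orders = list(permutations(features))
    for order in orders:
        coalition = []
        for f in order:
            before = value(frozenset(coalition))
            coalition.append(f)
            contrib[f] += value(frozenset(coalition)) - before
    return {f: c / len(orders) for f, c in contrib.items()}

# Toy "model output" for one input: additive effects plus an
# interaction between income and age (all numbers invented).
effects = {"income": 2.0, "age": 1.0, "city": 0.5}

def model_output(coalition):
    v = sum(effects[f] for f in coalition)
    if {"income", "age"} <= coalition:
        v += 0.6  # interaction, shared between the two features
    return v

phi = shapley_values(list(effects), model_output)
```

By symmetry the 0.6 interaction splits evenly between income and age, and the attributions always sum to the full model output; practical tools like SHAP approximate this average rather than enumerating all orderings.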
1. Historic data and its collection
2. Variable selections
3. Handling correlated variables and missing values
4. Calculating player strength
5. Prediction of team winning or losing a match based on team strength (derived from player strength)
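Step 5 above can be sketched minimally: aggregate player strengths into a team strength and map the strength difference to a win probability through a logistic link. The player ratings, the mean aggregation, and the scale constant are illustrative assumptions, not taken from the talk.

```python
import math

# Toy player ratings for two teams (invented numbers).
team_a = {"P1": 8.2, "P2": 7.5, "P3": 6.9}
team_b = {"Q1": 7.0, "Q2": 6.8, "Q3": 6.5}

def team_strength(players):
    # Simplest aggregation: mean of individual player strengths.
    return sum(players.values()) / len(players)

def win_probability(strength_a, strength_b, scale=1.0):
    # Logistic link: equal strengths -> 0.5; stronger team -> > 0.5.
    return 1.0 / (1.0 + math.exp(-(strength_a - strength_b) / scale))

p_a = win_probability(team_strength(team_a), team_strength(team_b))
```

In practice the scale parameter (and any weighting of players by role or form) would be fitted on historical match outcomes from steps 1-4.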
Being specialized in domains like computer vision and natural language processing is no longer a luxury but a necessity expected of any data scientist in today’s fast-paced world! With a hands-on and interactive approach, we will understand essential concepts in NLP along with extensive case studies and hands-on examples to master state-of-the-art tools, techniques and frameworks for applying NLP to solve real-world problems. We will leverage machine learning, deep learning and deep transfer learning to solve some popular tasks in NLP, including the following:
-Basics of Natural Language and Python for NLP tasks
-Text Processing and Wrangling
-Text Representation – BOW / Embeddings (a brief)
-Topic Modeling – Case Study on trending topics in research papers from NIPS proceedings
-Text Summarization – Factorization and Deep Transfer Learning Methods
-Text Classification with Machine Learning & Deep Learning / Deep Transfer Learning (ML methods, RNNs/LSTMs, CNNs)
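As a taste of the text-representation portion, a bag-of-words (BOW) encoding, the usual starting point before embeddings, can be sketched in a few lines of plain Python (toy corpus and naive whitespace tokenization assumed):

```python
from collections import Counter

# Minimal bag-of-words: each document becomes a vector of term counts
# over a fixed vocabulary. Tokenization is a naive whitespace split.
corpus = [
    "the food was great",
    "the delivery was slow",
]

# Build the vocabulary from the corpus (sorted for a stable order).
vocab = sorted({tok for doc in corpus for tok in doc.lower().split()})

def bow_vector(doc):
    counts = Counter(doc.lower().split())
    return [counts.get(term, 0) for term in vocab]

vectors = [bow_vector(doc) for doc in corpus]
```

Libraries like scikit-learn (`CountVectorizer`) do the same with proper tokenization and sparse storage; embeddings then replace these count vectors with dense learned representations.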
We as data scientists might be competent enough to build very complex and intricate data science algorithms, BUT if there is one skill that most of us lack, it’s deployment. And unless deployed, our model is as good as a simple demo. To scale, or even to put into production, it needs to be deployed as an API or embedded into existing systems.
Goal - During this talk, we will learn about AWS Lambda and Amazon API Gateway and how we can use them to deploy our data science models.
The talk will include a live demo during which we will:
1. Understand & create a lambda_function
2. Download & define layers (aka dependencies)
3. Test our function
4. Configure Amazon API gateway
5. Deploy the API and test it using Postman or Curl
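The shape of the lambda_function from step 1 can be sketched as follows. The handler name and the event/response shapes follow the standard AWS Lambda + API Gateway proxy convention; the `predict` function here is a hypothetical stand-in for a real model artifact (which would typically be loaded from a layer or S3).

```python
import json

def predict(features):
    # Placeholder for model.predict(); a real handler would load a
    # serialized model once, outside the handler, and call it here.
    return sum(features) / len(features)

def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda via API Gateway (proxy
    integration): the request body arrives as a JSON string."""
    body = json.loads(event.get("body") or "{}")
    features = body.get("features", [])
    if not features:
        return {"statusCode": 400,
                "body": json.dumps({"error": "features required"})}
    return {"statusCode": 200,
            "body": json.dumps({"prediction": predict(features)})}

# Simulate the payload API Gateway would forward (step 3: testing).
event = {"body": json.dumps({"features": [1.0, 2.0, 3.0]})}
response = lambda_handler(event, None)
```

The same request can be exercised end-to-end after step 5 with Postman or cURL against the deployed API Gateway URL.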
Businesses across domains are looking for sharper tools and techniques to identify and predict risk, and to develop data-driven strategies to mitigate that risk, which may take the form of financial default, customer attrition, fraud, employee attrition, etc. New-age machine learning techniques have given businesses tools to sharpen their focus on predicting risk, though the adoption of these techniques to drive business impact at scale is slowed by the need for explainability, especially in regulated industry domains like banking, insurance, and healthcare.
This hands-on session will focus on a step-by-step approach to building a risk framework, applying multiple machine learning techniques and using XAI to explain the factors driving risk at a local level. The session will be delivered via Analyttica TreasureHunt® LEAPS (leaps.analyttica.com); participants can register for ATH LEAPS using their email ID and follow and execute the approach during the session.
-An Anaconda development environment on the attendee's machine.
-A Jupyter Notebook installation. Anaconda will install Jupyter Notebook for you during its installation. You can also follow this tutorial for a guide on how to navigate and use Jupyter Notebook.
-Familiarity with Machine learning.