Did you miss our live sessions? On this page you will find recordings of all our live sessions.
In this format, our experts introduce you to interesting topics in Artificial Intelligence, Big Data, Machine Learning, Data Management and Data Analysis, and give you insights into getting the most out of the Data Intelligence Hub.
| Topic | In the live demo, we showcase the capabilities of the Data Intelligence Hub and explain its different functionalities. Using an example, we also show how to create and work with models quickly and easily. |
| Participation | To the recording |
Topic: Trustworthy data exchange in the supply chain! Learn about the potential of a networked supply chain and how to use it for your company - with a live demo. Sign up now!
Our Data Analytics expert answers your questions about the right data, the importance of causation and data governance in the magazine Produktion.
Data Intelligence Hub is listed by BDI as one of the leading B2B platforms for data exchange in Industry 4.0.
Umati and Telekom's Data Intelligence Hub deliver consistent standards for greater data sovereignty and efficient data exchange in the industry.
Immerse yourself in the exciting world of metalworking at EMO 2019. The Data Intelligence Hub team invites you and provides you with a free ticket.
T-Systems with its Data Intelligence Hub is part of the Daimler EDM CAE Forum, and this year it is all about communication, cooperation and inspiration.
Learn how to pull your data from the Azure blob storage and analyze it on the Data Intelligence Hub Jupyter Notebook.
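The workflow described above can be sketched in Python as it might run inside a Jupyter notebook. This is a minimal illustration, not the tutorial's exact code: the connection string, container name, and blob name are placeholders you would replace with your own Azure credentials, and the `azure-storage-blob` package must be installed separately.

```python
import csv
import io


def csv_bytes_to_rows(payload: bytes) -> list[dict]:
    """Parse downloaded CSV bytes into a list of row dicts for analysis."""
    return list(csv.DictReader(io.StringIO(payload.decode("utf-8"))))


def load_blob_rows(connection_string: str, container: str, blob_name: str) -> list[dict]:
    """Download one blob from Azure Blob Storage and parse it as CSV.

    Requires: pip install azure-storage-blob
    The container and blob names are hypothetical examples.
    """
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    blob = service.get_blob_client(container=container, blob=blob_name)
    payload = blob.download_blob().readall()
    return csv_bytes_to_rows(payload)


# In a notebook you might then call, with your own placeholder values:
# rows = load_blob_rows("<your-connection-string>", "sensor-data", "readings.csv")
```

Once the rows are in memory, they can be passed to pandas or plotted directly in the notebook; the parsing step is kept separate here so it can be tested without an Azure account.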
Mistakes slow down production and cost money. To avoid this and make the production process as efficient as possible, errors must be identified and eliminated at an early stage. Wouldn't it be ideal to have a digital replica that identifies defects before production?
Intelligent mobility is being used more and more. But creating this intelligence requires large amounts of data. To remain competitive as an OEM, data must be collected, analyzed and used efficiently. But how do you actually obtain this data?