Data is the new gold – and you can mine it with the Data Intelligence Hub: Use this innovative platform to securely and efficiently exchange, process and analyze data. Retain full sovereignty over your data. Our vision: the development of new data-driven services and applications for machine learning or artificial intelligence.
Get better results with more information – effective data sharing enables you to develop new, future-oriented, competitive and data-driven business models.
Exchange and use: The Data Intelligence Hub helps you turn existing or acquired unstructured data into business-critical insights together with AI and analytics specialists.
Your company’s data is a valuable asset: Offering your own data via the Data Intelligence Hub allows you to tap additional sources of revenue in a simple, transparent and secure manner.
Exchange and process data both in and beyond your own sector: The marketplace provides entirely new ways to execute your data strategy.
The Data Intelligence Hub offers a wide spectrum of proven analysis tools. Work on your own or collaborate on projects – regardless of your own hardware infrastructure.
Publish your data and analyses selectively: Who is allowed to view them, who to explore them, who to process them? All of this can be managed in one place in the Data Intelligence Hub.
Exchange data with all of your business partners along the data value chain and across international borders. You can rely on our infrastructure and security standards.
Receiving and providing data becomes simple with the Connector. Based on the open-source software Apache Kafka, the application provides an interface for loading and processing data streams. For large data volumes and peer-to-peer exchange, install the Connector locally on your own server.
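The stream-processing idea behind the Connector can be pictured with a minimal, self-contained sketch – plain Python, no Kafka broker required; the record structure and sensor names are illustrative assumptions, not the Connector's actual interface:

```python
from dataclasses import dataclass


@dataclass
class Record:
    """A single stream record: a key (e.g. sensor ID) and a measurement."""
    key: str
    value: float


def stream(records):
    """Simulate a data stream as a record-by-record generator."""
    yield from records


def process(records):
    """Consume the stream and keep a running total per key."""
    totals = {}
    for rec in stream(records):
        totals[rec.key] = totals.get(rec.key, 0.0) + rec.value
    return totals


readings = [Record("sensor-a", 1.5), Record("sensor-b", 2.0), Record("sensor-a", 0.5)]
print(process(readings))  # {'sensor-a': 2.0, 'sensor-b': 2.0}
```

In a real deployment the generator would be replaced by a Kafka consumer subscribed to a Connector topic; the per-record processing loop stays the same.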
How do you ultimately solve a problem like the coronavirus crisis? With the right medicine and vaccine. How do you find the right medication? Testing! How do you speed up the process? Simulation! But it has to be backed by science and a rigorous experimentation process – as we have done to identify solutions to the traffic problem in dense urban areas such as Berlin.
For smooth “just-in-time” material delivery, orders as well as participants must be coordinated. Systematic data processing helps with this: everyone receives the relevant information, and a translation service prevents misunderstandings.
The collaborative use of data offers valuable insights to all parties along the value chain, which can be used to optimize products and processes.
Hollywood already invented them: avatars, cyborgs, androids, clones. Today, leading politicians, managers and investors speak about the Digital Twin as a reflection of people as something perfectly natural. But what happens to data sovereignty?
New last mile technology has arrived, think e-scooters, smart parking apps … but intermodal mobility hasn’t. Are there no benefits for the end-user? Let’s find out … with data science.
With all the attention on big data, it is easy to overlook the AI potential of small or unbalanced samples – or “small data”. Cloud trends like edge-computing open new opportunities.
Farmers are highly dependent on weather and soil. Smart farming combines, for example, agricultural data with weather data so that timely measures can be taken against negative effects. In this way, product quality, quantity and pricing can be optimized.
In viticulture, external factors such as the weather play a major role in determining the harvest yield. Through the analysis of past and current weather data, linked with the weather forecast, IoT devices support the planning and optimization of the vintage.
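As a simple sketch of how such weather-linked planning might work – plain Python; the frost threshold and forecast values are illustrative assumptions, not an actual Data Intelligence Hub service:

```python
def frost_risk_days(forecast_min_temps, threshold=-1.0):
    """Return the indices of forecast days on which frost protection
    is advisable, given minimum temperatures in degrees Celsius."""
    return [day for day, temp in enumerate(forecast_min_temps) if temp <= threshold]


# Example: a five-day forecast with one cold night (day 2)
print(frost_risk_days([4.2, 1.0, -2.5, 0.3, 5.1]))  # [2]
```

A production system would feed this kind of rule with live IoT sensor readings and forecast data, and could trigger alerts or schedule protective measures automatically.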
Intelligent building management helps to use energy, water and other resources in an optimal way to lower costs for owners as well as residents.
How can street lighting be regulated with the lowest possible power consumption? And how do you know when a certain lamp will fail? By including the local weather and material lifetime in the calculation.
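The failure question above can be reduced to a remaining-lifetime estimate – a rough sketch in plain Python; the rated lifetime and nightly burn hours are illustrative assumptions:

```python
def days_until_replacement(rated_hours, hours_burned, daily_burn_hours):
    """Estimate the days until a lamp reaches its rated lifetime.
    daily_burn_hours can reflect the season: long winter nights
    consume lifetime faster than short summer nights."""
    remaining = max(rated_hours - hours_burned, 0.0)
    return remaining / daily_burn_hours


# A 10,000 h lamp with 9,500 h already burned, lit 10 h per winter night:
print(days_until_replacement(10_000, 9_500, 10))  # 50.0
```

A real model would refine the daily burn hours with local weather data (cloud cover, daylight length) rather than using a fixed value.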
Chancellor Merkel urged German enterprises to utilize the potential for value that lies in data; otherwise, a rude awakening is to be expected. But how can the transition to a data-oriented organization succeed? Implementing product management might be a good start.
We spend more than four years of our lives inside a car. If we are stuck in traffic jams 25 percent of that time, we age one year faster – unless we spend the time in a useful way. Maybe new technology can help.
At the moment, roads and parking dominate the scenery of inner cities. The arrival of new forms of mobility, however, will radically alter the consumption of urban space.
Mistakes and errors slow down production and cost money. How can they be prevented? Wouldn't it be ideal to identify defects before production starts? “Frontloading” with digital twins can do the trick.
There is a lot of open-source data out there, for instance on population density and means of transport. However, a differentiated picture and prognosis are only possible with dynamic data on movements.
With consistent standards, Umati and the Data Intelligence Hub enable efficient data exchange in the industry.
Intelligent mobility applications require large volumes of data. An OEM that sources, analyzes and utilizes this data, has a good chance to stay competitive.
There are various dependencies between the players in beverage production. A roles and rights system prevents unauthorized access during the exchange of production data.
Intelligent networked machines improve the process chain and products in manufacturing companies. The Data Intelligence Hub makes Industry 4.0 possible.
With smart monitoring, service technicians can easily monitor machines and production processes and react earlier – the Data Intelligence Hub offers such an analytics service.
In automated property operation, data from a variety of parties, sources and formats flows together. A unified system for the secure and simple exchange of data comes at the right time.
Intelligent waste management means that waste bins equipped with sensors report their fill level and geographical location. With this data, the ideal time for collection can be calculated.
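The collection-time calculation can be as simple as a linear extrapolation of the reported fill level – a minimal sketch in plain Python; the fill rate and capacity figures are illustrative assumptions:

```python
def days_until_full(current_level, daily_increase, capacity=100.0):
    """Linearly extrapolate a bin's fill level (in percent) to estimate
    the days remaining until it reaches capacity."""
    if daily_increase <= 0:
        return float("inf")  # level not rising; no collection needed yet
    return (capacity - current_level) / daily_increase


# A bin at 70 % that fills 6 percentage points per day
# should be collected within 5 days:
print(days_until_full(70.0, 6.0))  # 5.0
```

Combining these per-bin estimates with the bins' geographical locations is what turns the raw sensor data into an optimized collection route.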
Stakeholders in the logistics industry can plan more efficiently with the Data Intelligence Hub. Thanks to data analytics, they are always up to date.
Numerous data sources and algorithms are combined in the Data Intelligence Hub for a comprehensive inventory and forward-looking forecasts.
With the Telekom Data Intelligence Hub, municipalities can process their data in a targeted manner and make it available to citizens – without having to build their own infrastructure.
Easily generate value from your own and external data: Use the large number of different analysis tools the Data Intelligence Hub offers to process data and generate insights.
With the Cloudera Data Science Workbench, data scientists can manage their own analysis pipelines. This includes integrated planning, monitoring and email alerting. Quickly develop and prototype new machine learning projects and easily deploy them in production.
Azure Databricks is a fast, easy and collaboration-oriented analysis service based on Apache Spark. Create your big data pipelines in just a few steps based on the data collected by data lakes or streaming services. One-click setup, streamlined workflows, and an interactive workspace for data specialists, data engineers, and business analysts to collaborate.
Grafana Labs' free-to-use dashboarding tool Grafana helps you keep track of everything. It is widely used because it is mature and stable.
H2O.ai's open-source H2O service is a Java backend for machine learning applications. It either comes with a pre-set frontend for beginners or can be controlled via APIs in programming languages such as Python, R or Java. It includes its own implementations of popular ML algorithms that are among the best on the market.
Organizations are now tapping data science and artificial intelligence (AI) as a technology-enabled business strategy. Experimentation is accelerating across multiple clouds. IBM Watson Studio is a leading data science and machine learning platform built from the ground up for an AI-powered business. It helps enterprises simplify the process from experimentation to deployment, speed data exploration and model development and training, and scale data science operations across the lifecycle. IBM Watson Studio empowers organizations to tap into data assets, inject predictions into business processes and modern applications, and then optimize business value with visual data science and decision optimization. It is suited for hybrid multicloud environments that demand mission-critical performance, security and governance – in public clouds, in private clouds, on-premises and on the desktop.
This NumFOCUS open-source workbench is similar to easy-to-use text-document processing software. Unlike a standard editor, however, it provides the user with meaningful and efficient features for easily accessing data and other resources.
RStudio is an integrated development environment (IDE for short) that allows you to analyze data in the R programming language. The workbench includes a console, a syntax highlighting editor that supports direct code execution, and tools for plotting, history, debugging, and workspace management.
Developed by the Apache open source community, this tool helps with data acquisition, discovery, analysis and visualization. Interpreters allow you to plug code in many languages into Zeppelin, with support for Apache Spark, R, Hive, Shell, Cassandra, and more.
Benefit from the network of Deutsche Telekom: Forge alliances, meet with customers and service providers of all sizes and industries – for more joint success.
GAIA-X aims at building a unified European data infrastructure that acts as a cross-border ecosystem. GAIA-X.NRW aims to enable small and medium-sized companies to participate in the data boom and reap benefits by using data provided by large companies. Specific use cases are being realized, including mobility, logistics, Industry 4.0, production and the energy sector.
The European funding project FENIX connects different pilot regions and transport corridors; to do so, the project relies on the know-how of T-Systems. An important building block for enabling this connection is the Telekom Data Intelligence Hub, which allows secure, encrypted cross-border data exchange. As a neutral participant, it connects the diverse logistics IT systems on one unified platform.
Manufacturers need punctual delivery of their goods and successful placement of their promotional products on the sales floor. In order to offer companies a more efficient, effective and cost-effective way of carrying out promotions, CHEP has created a new service in collaboration with Deutsche Telekom as its technology partner. The service is based on a long-life, low-cost tracker, the basis of which was developed by Deutsche Telekom.
We end this year with a handful of Marketplace and Workspace improvements. We introduced a new tool, JupyterLab, and improved the tools page in many respects. Furthermore, we fixed several bugs and simplified working with data sources and files.
The FAZ has obtained a new draft law for the sovereign handling and exchange of industrial data within the EU. The motivation behind the new edition is, on the one hand, the endeavor to enforce valid European data-security laws and, on the other, the long-term goal of creating an independent, innovation-promoting European data space.
Want to know how to successfully share, analyze and use data in the Data Intelligence Hub to gain business insights? Register for free and start now…