Berlin Big Data Center (BBDC)
The Berlin Big Data Center (BBDC) [1] develops highly innovative technologies to organize vast amounts of data and to derive informed decisions from them in order to create economic and social value. This is achieved by merging the previously isolated disciplines of data management and machine learning. The center's technologies reduce the cost of analyzing Big Data, broaden the range of people who can perform large-scale data analysis, and strengthen Germany's leading position in this field in science and industry. The focus is on three exemplary application areas of economic, scientific, and social relevance: materials science, medicine, and information marketplaces. Building on internationally recognized, leading-edge research, the center aims to enable automatic optimization, parallelization, and scalable, adaptive processing algorithms. This covers work in the areas of machine learning, linear algebra, statistics, probability theory, computational linguistics, and signal processing.
In order to optimally prepare industry, science, and society in Germany and Europe for the global Big Data trend, highly coordinated activities in research, teaching, and technology transfer are required that integrate data analysis methods with scalable data processing. To this end, the Berlin Big Data Center pursues the following seven objectives:
- Pooling expertise in scalable data management, data analytics, and big data applications.
- Conducting fundamental research to develop novel and automatically scalable technologies capable of performing “Deep Analysis” of “Big Data”.
- Developing an integrated, declarative, highly scalable open-source system that enables the specification, automatic optimization, parallelization and hardware adaptation, and fault-tolerant, efficient execution of advanced data analysis problems using varying methods (e.g., drawn from machine learning, linear algebra, statistics and probability theory, computational linguistics, or signal processing), leveraging our work on Apache Flink [2] (see the sketch after this list).
- Transferring technology and know-how to support innovation in companies and startups.
- Educating data scientists with respect to the five big data dimensions (i.e., application, economic, legal, social, and technological) via leading educational programs.
- Empowering people to leverage “Smart Data”, i.e., to discover new information in their massive data sets.
- Enabling the general public to conduct sound data-driven decision-making.
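To make the declarative objective above concrete, the following is a minimal, illustrative sketch of a data analysis program written against Apache Flink's Java DataSet API: the programmer specifies what to compute (tokenize, group, aggregate), while Flink's optimizer decides how to parallelize and execute it. The input strings and the Tokenizer class are hypothetical examples for illustration, not BBDC artifacts.

    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.util.Collector;

    public class WordCount {
        public static void main(String[] args) throws Exception {
            // Obtain the execution environment; Flink decides whether this
            // runs locally or on a cluster.
            final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Hypothetical in-memory input, standing in for a large distributed data set.
            DataSet<String> text = env.fromElements(
                    "big data needs deep analysis",
                    "deep analysis of big data");

            // Declarative pipeline: tokenize, group by word, sum the counts.
            // Flink's optimizer picks the parallel execution strategy.
            DataSet<Tuple2<String, Integer>> counts = text
                    .flatMap(new Tokenizer())
                    .groupBy(0)
                    .sum(1);

            counts.print();
        }

        // Splits each line into (word, 1) pairs.
        public static final class Tokenizer
                implements FlatMapFunction<String, Tuple2<String, Integer>> {
            @Override
            public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                for (String word : line.toLowerCase().split("\\W+")) {
                    if (!word.isEmpty()) {
                        out.collect(new Tuple2<>(word, 1));
                    }
                }
            }
        }
    }

Because the pipeline is specified declaratively rather than as hand-written parallel code, the same program runs unchanged on a laptop or a cluster, which is the kind of automatic parallelization and hardware adaptation the objective above describes.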
Consortium
TU Berlin:
- Database Systems and Information Management (DIMA)
- Machine Learning (ML)
- Internet Network Architectures (INET)
- Complex and Distributed IT Systems (CIT)
- Image Communication (IC)
DFKI:
- Language Technology Lab (LT)
- Intelligent Analysis of Mass Data Lab (IAM)
Zuse Institute Berlin:
- Distributed Algorithms and Supercomputing (DAS)
- Mathematics for Life and Materials Sciences (MfLMS)
Beuth Hochschule:
- Data Science Lab (DSL)
Fritz Haber Institute of the Max Planck Society:
- Theory Department (MPI/FHI)
Further project information [3]
- Funding: Federal Ministry of Education and Research
- Grant number: 01IS14013A
- Website: www.bbdc.berlin [4]
Contact
Sasho Nedelkoski
Phone: +49 30 314-24813
Room TEL 1209
e-mail query [5]
Contact
Lauritz Thamsen
Phone: +49 30 314-24539
Room TEL 1210
e-mail query [6]