Who this course is for. This 3-credit course is one of the sections of the course Large Scale Data Management of the Master of Science in Engineering in Computer Science at Sapienza Università di Roma.
Prerequisites. A good knowledge of the fundamentals of
Programming Structures, Programming Languages, Databases (SQL,
relational data model, Entity-Relationship data model, conceptual and
logical database design) and Database systems.
Course goals. In one sentence, Big Data is data that exceeds the
processing capacity of conventional database systems. In particular,
Big Data applications deal with huge amounts of data, possibly
collected from a very large number of sources (volume), in highly
heterogeneous formats (variety), and arriving at a very high rate (velocity).
This scenario calls for new technologies to be developed, ranging from
new data storage mechanisms to new computing frameworks. In this course
we will look at several key technologies used in manipulating, storing,
and analyzing big data. In particular, we will study architectures for
data-intensive distributed applications, Data Warehouse solutions,
and NoSQL storage solutions, including RDF and graph databases.
Lectures
- When: during the second semester (February 20 - June
2, 2017), every Wednesday at 9.00-11.00
- Where: via Ariosto 25, Roma - classroom A2
Schedule
- Lecture 1 (March 1)
- Course Introduction; Introduction to Big Data
- Lecture 2 (March 8)
- Data Warehousing: Introduction; Architectures
- Lecture 3 (March 15)
- Data Warehousing: ETL; multidimensional model; ROLAP vs. MOLAP; Design Methodology
- Lecture 4 (March 22)
- Data Warehousing: Conceptual Modeling for DW: Dimensional Fact Model (DFM); Logical Modeling through the ROLAP approach: the star schema.
- Lecture 5 (April 5)
- Data Warehousing: ROLAP: star schema and snowflake schema; Logical design. Introduction to Graph Databases
- Lecture 6 (April 12)
- Graph Databases: Graph DBs vs. relational DBs; Implementation of Graphs; Querying Graph DBs; Types of Graph DBs; Resource Description Framework (RDF)
- Lecture 7 (April 19)
- Graph Databases: RDFS; SPARQL; Linked Open Data
- Lecture 8 (April 26)
- Aggregate Databases: introduction to aggregate DBs; NoSQL data models: key-value, document, and column-family databases
- Lecture 9 (May 3)
- Aggregate Databases: Data Modeling; Distribution Models; Consistency: Update consistency
- Lecture 10 (May 10)
- Aggregate Databases: Consistency: Read Consistency, CAP Theorem
Slides
Slides are available at http://elearning2.uniroma1.it/
To access the material, log in to the system with your INFOSTUD
account and select the course on Big Data Management.
Exams
There are two options for the exam:
(1) Development of a small project. Students are strongly encouraged to propose their own ideas for projects. As a suggestion, they can refer to (and also select from) the following list of tools. The project connected to a tool consists, for example, of studying the logical data model(s) adopted by the tool, the native storage data structures it uses, and the query language it provides, and of highlighting further distinguishing features. A demonstration of the basic use of the tool through one or more examples is also required (a minimal sketch of what such a demonstration might look like is given right after the list of tools). Presentations connected to projects (possibly supported by slides) should last around 20 minutes (including the demo).
- key-value database tools
  - Riak
  - Redis
  - MemcachedDB
  - Voldemort
- document database tools
  - MongoDB
  - Couchbase
  - MarkLogic (Enterprise NoSQL)
- column-family database tools
  - Cassandra
  - HBase
  - Hypertable
- Data Warehousing tools
  - Hive
  - QlikView (a proprietary front-end tool for Business Intelligence; a personal edition can be downloaded for study purposes. Since it is a front-end tool, the focus of the student's analysis should be on the mechanisms the tool provides for data analytics and for multidimensional access to data, rather than on data models or storage data structures.)
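For instance, a basic demonstration of one of the key-value tools can be as simple as storing and retrieving a few values through the tool's client library. The following sketch is only an illustration of the expected level of detail; it assumes a Redis server running locally on the default port and the redis-py client library, and all keys and values are invented for the example.

    import redis  # redis-py client, assumed installed (e.g., pip install redis)

    # Connect to a Redis server assumed to be running locally on the default port.
    r = redis.Redis(host='localhost', port=6379, db=0)

    # Store and retrieve a simple key-value pair (illustrative names and values).
    r.set('course:lsdm:section', 'Big Data Management')
    print(r.get('course:lsdm:section'))  # b'Big Data Management'

    # A hash groups several related fields under a single key.
    r.hset('lecture:1', mapping={'date': 'March 1', 'topic': 'Introduction to Big Data'})
    print(r.hgetall('lecture:1'))

Analogous minimal examples can be prepared with the client libraries or shells of the other tools in the list.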
Note: This kind of project can be developed individually or
by groups of two students. In the latter case, the presentation should
be split evenly into two parts, one given by each member of the group,
and the overall presentation time can be extended to 30-40 minutes.
The exam will consist of the project presentation, with possible additional questions on the
topics covered by this section of the Large Scale Data Management exam.
To have a project assigned, students must send an email to
lembo@diag.uniroma1.it
indicating the kind of project they would like to present (please do
not start working on a project before it has been assigned to you).
(2) Article Presentation
Article presentation consists of preparing a 20-minute presentation about
scientific papers assigned by the lecturer or proposed by the students. Send an email to lembo@diag.uniroma1.it to ask for the assignment of papers to study as final work (please do not start working on a paper before it has
been assigned to you).
Note: Article presentation can be carried out only individually.
Note: Both project and paper presentations will preferably take place
during office hours. Students
are however required to send an email in advance to fix the exact date
and time of their presentation.
Note: We recall that these exam details refer only to the
section on Big Data Management of the course "Large Scale Data Management". Once you have passed the exam for this section, the result will be
communicated to Prof. Maurizio Lenzerini, who is responsible for the
course for this academic year. The exam for the overall course
"Large Scale Data Management" will be officially recorded
(verbalizzato) through the INFOSTUD system only once the student has
successfully passed the exams of all the sections of the course. For details
on this final registration, please refer to the web
page of the course "Large Scale Data Management".