Assignment on Understanding and Designing Schemas for Data Storing and Manipulation in SQL and MongoDB

Introduction

In this assignment, you are given a set of data encoded in some file format. The objectives are:
1) Understand the data using the file descriptions and the knowledge graph in this document.
2) Design the schemas for data storing and manipulation in two database management systems (DBMS): Oracle SQL and MongoDB (NoSQL).
3) Insert the data into the DBMSs.
4) Write queries to answer specific questions.
5) Observe the query execution times in the two DBMS environments.
Knowledge Graph
The graph helps you understand the knowledge that can be extracted from the data. For example, an instance of "Country" has a "capital," which is an instance of a "City." Open the files country.json and city.json and study how this information is encoded in the files. Use the information provided for those files in the Section "Data files." Do the same for all other graph connections. Once you have finished this exploration, you should be familiar with the dataset.
Data files
The dataset provided to you corresponds to realistic geographical data. All the data are encoded in JSON files, and each file has two parts: the "column" part and the "items" part. The "column" part lists the column names and their data types. For example, city.json has a column "province" of type "varchar2."
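As a rough illustration only, a data file might be laid out as shown below. The "province" column and its "varchar2" type come from the description above; every other column name and the item values are hypothetical placeholders, so check the actual files for the real structure.

{
  "column": [
    {"name": "name", "type": "varchar2"},
    {"name": "province", "type": "varchar2"},
    {"name": "latitude", "type": "number"},
    {"name": "longitude", "type": "number"}
  ],
  "items": [
    {"name": "ExampleCity", "province": "ExampleProvince", "latitude": 10.0, "longitude": 20.0}
  ]
}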

Every data file in the set has a specific purpose and provides data for different parts of the knowledge graph.
Part 1 – Understanding the data
Use the information provided in the Introduction section to understand the data. For each file, examine the data in each of the columns.
1-1 Is each column present in every data entry? List which columns do not appear in every data entry.
1-2 Given the file descriptions and the knowledge graph, are there any redundant columns in the files? List which files have redundant columns, name the columns, and explain why, in your opinion, they are redundant.
1-3 What logical constraints would you set, and for which columns in the files? For example, latitude should always be in the range [-90, 90], and longitude in [-180, 180]. You do not have to set constraints for every column of every file.
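As a sketch of how a constraint like the latitude/longitude example above could later be expressed in Oracle SQL (the table and column names here are assumed placeholders, not taken from the actual files):

ALTER TABLE city ADD CONSTRAINT city_lat_chk CHECK (latitude BETWEEN -90 AND 90);
ALTER TABLE city ADD CONSTRAINT city_lon_chk CHECK (longitude BETWEEN -180 AND 180);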
Part 2 – Relational database schema
Use the information from the Section "Introduction" and your analysis in Part 1 to create a relational database schema for the Oracle DBMS.
2-1 Write an SQL script that contains the schema for the storage and manipulation of the data provided in the JSON files (a rough sketch of a schema fragment is given after the list of requirements below). Requirements:
a. The database needs to be in 3NF. This means that any two related tables must be connected through one-to-many relationships, and any data redundancies must be removed.
b. Define the primary and foreign keys.
c. Define the CHECK constraints you came up with for the appropriate columns (question 1-3).
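A minimal sketch of what a fragment of such a schema script could look like, assuming a country table and a city table with placeholder columns (the real column names, types, and keys must be derived from the JSON files and your Part 1 analysis):

CREATE TABLE country (
  code VARCHAR2(4) PRIMARY KEY,   -- primary key (requirement b)
  name VARCHAR2(50) NOT NULL
);

CREATE TABLE city (
  id NUMBER PRIMARY KEY,
  name VARCHAR2(50) NOT NULL,
  province VARCHAR2(50),
  country VARCHAR2(4) REFERENCES country(code),  -- foreign key: many cities per country (one-to-many, requirements a and b)
  latitude NUMBER,
  longitude NUMBER
  -- plus the CHECK constraints from question 1-3 (requirement c), e.g. the latitude/longitude ranges shown earlier
);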
2-2 Create the entity-relationship diagram of your database. You can do this after you have executed your schema script in SQL Developer. Refer to the handout of the first TA lecture for how to generate this diagram.

Part 3 – Inserting the data into the Oracle DBMS
In many real-life scenarios across different computer science domains, data may not be provided to you in the form you need. For example, in project 1 you were provided with the data in the form of INSERT statements, so you did not have to do anything other than execute the script. In the machine learning domain, data usually need to be scaled and cleaned up before being fed into a machine learning model.
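To make the contrast with project 1 concrete: in this project you will have to produce the INSERT statements (or an equivalent loading mechanism) yourself from the JSON items. A single statement could look roughly like the following, where the table, columns, and values are placeholders matching the sketch in Part 2 rather than the actual data:

INSERT INTO city (id, name, province, country, latitude, longitude)
VALUES (1, 'ExampleCity', 'ExampleProvince', 'EX', 10.0, 20.0);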
