
The Post Graduate Diploma in Big Data Analytics (PG-DBDA) programme blends theory and practice, with the following focus:
- To explore the fundamental concepts of big data analytics and build in-depth knowledge and understanding of the domain
- To understand various search methods and visualization techniques, and to use various techniques for mining data streams
- To analyze and solve problems, conceptually and practically, from diverse industries such as government, manufacturing, retail, education, banking/finance, healthcare and pharmaceuticals
- To undertake consulting and industrial projects with a significant data-analysis component, for a better understanding of theoretical concepts from statistics and economics, and to build data-analytics solutions that contribute to technological advancement
- To use advanced analytical tools, decision-making tools and operations research techniques to analyze complex problems, and to be ready to develop new such techniques in the future
- To learn cloud computing, accessing the resources and services needed to perform functions with dynamically changing needs
The eligibility criteria for the PG-DBDA course are:
- Graduate in Engineering (10+2+4 or 10+3+3 years) in IT / Computer Science / Electronics / Telecommunications / Electrical / Instrumentation, OR
- MSc/MS (10+2+3+2 years) in Computer Science, IT or Electronics, OR
- Graduate in any discipline of Engineering, OR
- Post Graduate Degree in Management with a corresponding basic degree in Computer Science, IT or Computer Application, OR
- Post Graduate Degree in Mathematics / Statistics / Physics, OR
- MCA, MCM
- The candidates must have secured a minimum of 55% marks in their qualifying examination.
The Post Graduate Diploma in Big Data Analytics (PG-DBDA) course is delivered in fully ONLINE or fully PHYSICAL mode. The total course fee and payment details for the fully PHYSICAL and fully ONLINE modes of delivery are as follows:
1. PHYSICAL Mode of Delivery:
The course fee for the fully PHYSICAL mode of delivery is INR 1,15,000/- plus Goods and Services Tax (GST) as applicable per Government of India (GOI) rules.
The course fee for PG-DBDA has to be paid in two installments as per the schedule:
- First installment: INR 10,000/- plus GST as applicable.
- Second installment: INR 1,05,000/- plus GST as applicable.
2. ONLINE Mode of Delivery:
The course fee for the fully ONLINE mode of delivery is INR 97,750/- plus Goods and Services Tax (GST) as applicable per GOI rules.
The course fee for PG-DBDA has to be paid in two installments as per the schedule:
- First installment: INR 10,000/- plus GST as applicable.
- Second installment: INR 87,750/- plus GST as applicable.
The course fee includes expenses towards delivering classes, conducting examinations, issuing the final mark-list and certificate, and providing placement assistance.
The first installment of Rs 10,000/- plus GST (as applicable at the time of payment) is to be paid online as per the schedule, using credit/debit cards through the payment gateway, after a seat is allocated during the counselling rounds.
The second installment of the course fees is to be paid before the course commencement through NEFT.
NOTE: Candidates may take note that no Demand Draft (DD) or cheque or cash will be accepted at any C-DAC training centre towards payment of any installment of course fees.
Linux Programming and Cloud Computing: Installation (Ubuntu and CentOS), Basics of Linux, Configuring Linux, Shells, Commands and Navigation, Common Text Editors, Administering Linux, Introduction to Users and Groups, Linux shell scripting, shell computing, Introduction to enterprise computing, Remote access.
Introduction to Git/GitHub: Introduction to version control systems, creating a GitHub repository, using Git - introduction to Git commands.
Introduction to Cloud Computing: Cloud Computing Basics, Understanding Cloud Vendors (AWS: EC2 instances, Lambda; Heroku: Heroku platform, Heroku Data services), Definition, Characteristics, Components, Cloud providers, SaaS, PaaS, IaaS and other organizational scenarios of clouds, Administering & Monitoring cloud services, benefits and limitations, Deploying applications over the cloud, Comparison among SaaS, PaaS and IaaS, Cloud Products and Solutions, Cloud Pricing, Compute Products and Services, Elastic Compute Cloud (EC2), Dashboard.
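For illustration, below is a minimal boto3 sketch for working with EC2 from Python. It assumes AWS credentials are already configured (e.g. via `aws configure`); the AMI ID, instance type and region are placeholders, not values from the syllabus.

```python
# Minimal sketch: launching and listing EC2 instances with boto3.
# Assumes AWS credentials are configured; the AMI ID, instance type
# and region below are purely illustrative.
import boto3

ec2 = boto3.resource("ec2", region_name="ap-south-1")

# Launch a single t2.micro instance (hypothetical AMI ID).
instances = ec2.create_instances(
    ImageId="ami-0abcdef1234567890",   # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
print("Launched:", instances[0].id)

# List all running instances in the region.
for inst in ec2.instances.filter(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    print(inst.id, inst.instance_type, inst.public_ip_address)
```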
Python Programming: Python basics, If, If-else, Nested if-else, Looping, For, While, Nested loops, Control Structures, Break, Continue, Pass, Strings and Tuples, Accessing Strings, Basic Operations, String slices, Working with Lists, Accessing lists, Operations, Functions and Methods, Files, Modules, Dictionaries, Functions and Functional Programming, Declaring and calling Functions, Declare, assign and retrieve values from Lists, Introducing Tuples, Accessing tuples, Visualizing using Matplotlib and Seaborn, OOP concepts, Class and object, Attributes, Inheritance, Overloading, Overriding, Data hiding, Operations, Exceptions, Exception Handling, except clause, Try-finally clause, User-Defined Exceptions, Data wrangling, Data cleaning, Loading images and audio files using Python libraries (Pillow/scikit-learn), Creation of a Python virtual environment, Logging in Python.
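A short, self-contained Python snippet illustrating several of these topics (control flow, list slicing, a user-defined exception, inheritance and overriding); the class names and data are invented for demonstration.

```python
# Illustrative Python snippet: control flow, list slicing,
# a user-defined exception, and inheritance with overriding.

class NegativeMarksError(Exception):
    """User-defined exception raised for invalid marks."""

class Student:
    def __init__(self, name, marks):
        if any(m < 0 for m in marks):
            raise NegativeMarksError(f"Negative marks for {name}")
        self.name = name
        self.marks = marks

    def average(self):
        return sum(self.marks) / len(self.marks)

class GradedStudent(Student):          # inheritance + method overriding
    def average(self):
        top_three = sorted(self.marks, reverse=True)[:3]   # list slicing
        return sum(top_three) / len(top_three)

try:
    s = GradedStudent("Asha", [78, 91, 64, 85])
    print(f"{s.name}: {s.average():.1f}")
except NegativeMarksError as err:
    print("Invalid record:", err)
```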
R Programming: Reading and Getting Data into R, Exporting Data from R, Data Objects - Data Types & Data Structures, Viewing Named Objects, Structure of Data Items, Manipulating and Processing Data in R (Creating, Accessing and Sorting data frames; Extracting, Combining, Merging and Reshaping data frames), Control Structures, Functions in R (numeric, character, statistical), Working with objects, Viewing Objects within Objects, Constructing Data Objects, Packages - tidyverse, dplyr, tidyr, etc., Queuing Theory, Case Study
Introduction to Business Analytics using case studies, Summary Statistics, Making the Right Business Decisions based on data, Statistical Concepts, Descriptive Statistics and its measures, Probability theory, Probability Distributions (continuous and discrete: Normal, Binomial and Poisson) and Data, Sampling and Estimation, Statistical Inference, Predictive modeling and analysis, Bayes' Theorem, Central Limit Theorem, Statistical Inference Terminology (types of errors, tails of a test, confidence intervals, etc.), Hypothesis Testing, Parametric Tests: ANOVA, t-test; Non-parametric Tests: Chi-square, U-test; Data Exploration & Preparation, Concepts of Correlation, Covariance, Outliers, Simulation and Risk Analysis, Optimization (Linear, Integer), Overview of Factor Analysis, Directional Data Analytics, Functional Data Analysis; Predictive Modelling (From Correlation to Supervised Segmentation): Identifying Informative Attributes, Segmenting Data by Progressive Attribute Selection, Models, Induction and Prediction, Supervised Segmentation, Visualizing Segmentations, Trees as Sets of Rules, Probability Estimation; Decision Analytics: Evaluating Classifiers, Analytical Framework, Evaluation, Baseline Performance and Implications for Investments in Data; Evidence and Probabilities: Explicit Evidence Combination with Bayes' Rule, Probabilistic Reasoning; Business Strategy: Achieving Competitive Advantage, Sustaining Competitive Advantage
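As an illustration of hypothesis testing from this module, here is a minimal two-sample t-test sketch using SciPy; the "branch sales" figures are simulated and purely for demonstration.

```python
# Illustrative hypothesis test: two-sample t-test with SciPy.
# The sample data are made up for demonstration only.
import numpy as np
from scipy import stats

np.random.seed(0)
branch_a = np.random.normal(loc=52.0, scale=5.0, size=40)   # sales, branch A
branch_b = np.random.normal(loc=49.5, scale=5.0, size=40)   # sales, branch B

t_stat, p_value = stats.ttest_ind(branch_a, branch_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# Reject H0 (equal means) at the 5% significance level if p < 0.05.
if p_value < 0.05:
    print("Evidence of a difference between the two branches")
else:
    print("No significant difference detected")
```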
Python Libraries – Pandas, NumPy, Scrapy, Plotly, Beautiful Soup
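A brief, illustrative use of Pandas and NumPy from this list; the column names and values are invented.

```python
# Minimal Pandas/NumPy sketch: build a small frame, clean it, summarise it.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "region": ["North", "South", "North", "East", "South"],
    "sales":  [120.0, np.nan, 95.0, 180.0, 150.0],
})

df["sales"] = df["sales"].fillna(df["sales"].mean())   # simple imputation
summary = df.groupby("region")["sales"].agg(["count", "mean", "sum"])
print(summary)
```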
Introduction to Big Data: Big Data Beyond the Hype, Big Data Skills and Sources of Big Data, Big Data Adoption, Research and the Changing Nature of Data Repositories, Data Sharing and Reuse Practices and Their Implications for Repository Data Curation
Hadoop: Introduction to Big Data programming with Hadoop, The ecosystem and stack, The Hadoop Distributed File System (HDFS), Components of Hadoop, Design of HDFS, Java interfaces to HDFS, Architecture overview, Development Environment, Hadoop distributions and basic commands, Eclipse development, The HDFS command line and web interfaces, The HDFS Java API (lab), Analyzing the Data with Hadoop, Scaling Out, Hadoop event stream processing, complex event processing, MapReduce Introduction, Developing a MapReduce Application, How MapReduce Works, Anatomy of a MapReduce Job Run, Failures, Job Scheduling, Shuffle and Sort, Task execution, MapReduce Types and Formats, MapReduce Features, Real-World MapReduce
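To make the MapReduce flow concrete, here is a hedged word-count sketch in the Hadoop Streaming style, with the mapper and reducer written in Python. In a real job they would typically be two separate scripts submitted with the hadoop-streaming JAR; they are shown together here only for illustration.

```python
# Word-count sketch in the MapReduce style, written for Hadoop Streaming.
# In practice: hadoop jar hadoop-streaming*.jar -mapper mapper.py -reducer reducer.py ...
import sys
from itertools import groupby

def mapper(lines):
    """Emit (word, 1) pairs, one per line, tab-separated."""
    for line in lines:
        for word in line.strip().split():
            yield f"{word}\t1"

def reducer(lines):
    """Sum counts per word; input is assumed already sorted by key."""
    parsed = (line.strip().split("\t") for line in lines)
    for word, group in groupby(parsed, key=lambda kv: kv[0]):
        total = sum(int(count) for _, count in group)
        yield f"{word}\t{total}"

if __name__ == "__main__":
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    stream = mapper(sys.stdin) if stage == "map" else reducer(sys.stdin)
    for out in stream:
        print(out)
```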
Hadoop Environment: Setting up a Hadoop Cluster, Cluster specification, Cluster Setup and Installation, Hadoop Configuration, Security in Hadoop, Administering Hadoop, HDFS – Monitoring & Maintenance, Hadoop benchmarks
Apache Airflow: Introduction to Data Warehousing and Data Lakes, Designing a Data Warehouse for an ETL Data Pipeline, Designing Data Lakes for an ETL Data Pipeline, ETL vs ELT
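For illustration, a minimal Airflow DAG sketch of an ETL-style pipeline, assuming Airflow 2.x is installed; the DAG id, schedule and task bodies are placeholders.

```python
# Minimal Apache Airflow DAG sketch for an ETL-style pipeline (Airflow 2.x).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling raw records from the source system")

def transform():
    print("cleaning and reshaping the records")

def load():
    print("writing curated records to the warehouse / data lake")

with DAG(
    dag_id="etl_demo",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load   # ETL ordering
```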
Introduction to HIVE, Programming with Hive: Data warehouse system for Hadoop, Optimizing with Combiners and Partitioners (lab), Bucketing, more common algorithms: sorting, indexing and searching (lab), Relational manipulation: map-side and reduce-side joins (lab), evolution, purpose and use, Case Studies on Ingestion and Warehousing
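As one illustrative way to query Hive from Python, here is a minimal sketch using the PyHive client (PyHive itself is an assumption, not named in the syllabus); it assumes a HiveServer2 endpoint on localhost:10000 and a hypothetical `sales` table.

```python
# Hedged sketch of querying Hive from Python with PyHive.
# Assumes a HiveServer2 endpoint and an existing `sales` table.
from pyhive import hive

conn = hive.Connection(host="localhost", port=10000, database="default")
cursor = conn.cursor()

# Bucketing/partitioning would be declared in the table DDL; here we
# simply run an aggregate query against the warehouse table.
cursor.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
for region, total in cursor.fetchall():
    print(region, total)

cursor.close()
conn.close()
```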
HBase: Overview, comparison and architecture, Java client API, CRUD operations and security
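The syllabus mentions the Java client API; as a hedged Python stand-in, here is a CRUD sketch using the happybase client, assuming a local HBase Thrift server and a hypothetical `users` table with column family `info`.

```python
# Illustrative HBase CRUD via the happybase Python client.
# Assumes a Thrift server on localhost and an existing 'users' table.
import happybase

connection = happybase.Connection("localhost")
table = connection.table("users")

# Create / update a row (put), then read it back (get).
table.put(b"user1", {b"info:name": b"Asha", b"info:city": b"Pune"})
row = table.row(b"user1")
print(row[b"info:name"].decode())

# Delete the row.
table.delete(b"user1")
connection.close()
```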
Apache Spark: APIs for large-scale data processing: Overview, Linking with Spark, Initializing Spark, Resilient Distributed Datasets (RDDs), External Datasets, RDD Operations, Passing Functions to Spark, Job optimization, Working with Key-Value Pairs, Shuffle operations, RDD Persistence, Removing Data, Shared Variables, EDA using PySpark, Deploying to a Cluster, Spark Streaming, Spark MLlib and ML APIs, Spark DataFrames/Spark SQL, Integration of Spark and Kafka, Setting up Kafka Producer and Consumer, Kafka Connect API, MapReduce, Connecting DBs with Spark
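A short PySpark sketch touching a few of these items (creating RDDs, key-value operations, and Spark SQL over a DataFrame); the data is invented.

```python
# PySpark sketch: RDD word count with key-value pairs, then Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pgdbda-demo").getOrCreate()
sc = spark.sparkContext

# RDD word count with key-value pairs.
rdd = sc.parallelize(["big data", "big analytics", "data pipelines"])
counts = (rdd.flatMap(lambda line: line.split())
             .map(lambda word: (word, 1))
             .reduceByKey(lambda a, b: a + b))
print(counts.collect())

# DataFrame / Spark SQL.
df = spark.createDataFrame([("Asha", 34), ("Ravi", 41)], ["name", "age"])
df.createOrReplaceTempView("people")
spark.sql("SELECT name FROM people WHERE age > 35").show()

spark.stop()
```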
Clustering and filtering approaches in big data using Machine Learning models, Energy-efficient big data gathering, Dynamic Big Data Storage on Cloud & Fine-Grained Updates.
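As a minimal sketch of the clustering approach mentioned above, the example below uses scikit-learn's KMeans on synthetic two-dimensional data; the dataset and cluster count are illustrative.

```python
# Minimal clustering sketch with scikit-learn's KMeans on synthetic data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
points = np.vstack([
    rng.normal(loc=[0, 0], scale=0.5, size=(50, 2)),
    rng.normal(loc=[5, 5], scale=0.5, size=(50, 2)),
])

model = KMeans(n_clusters=2, n_init=10, random_state=42).fit(points)
print("Cluster centres:", model.cluster_centers_)
print("First ten labels:", model.labels_[:10])
```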
After completing this course, students will be trained in statistics and machine learning using Python, enabling them to make data-driven decisions that provide a competitive advantage in the market. Technologies such as Hadoop, Spark, Hive and machine learning provide a springboard to AI, making them ready for Industry 4.0. At the end of the course, students will be able to work as Data Analysts and Data Engineers. Studying big data will also broaden their horizons, given the strong market forecasts for Big Data Analytics.
Centre locations (state and PIN code): Karnataka 560100; Tamil Nadu 600113; Andhra Pradesh 500016; West Bengal 700091; Maharashtra 400614, 400049, 411008, 411044, 411004, 411057; New Delhi 110025; Uttar Pradesh 201307; Assam 788010; Kerala 695581.
Q. What is the Eligibility for PG-Diploma in Big Data Analytics?
A. The eligibility criteria for PG-DBDA: the candidate must hold any one of the following degrees:
- Graduate in Engineering (10+2+4 or 10+3+3 years) in IT / Computer Science / Electronics / Telecommunications / Electrical / Instrumentation, OR
- MSc/MS (10+2+3+2 years) in Computer Science, IT or Electronics, with Mathematics at 10+2, OR
- Graduate in any discipline of Engineering, OR
- Post Graduate Degree in Engineering Sciences with corresponding basic degree (e.g. MSc in Computer Science, IT, Electronics) OR
- Post Graduate Degree in Mathematics / Statistics / Physics / MBA Systems, OR
- MCA
- The candidates must have secured a minimum of 55% marks in their qualifying examination
Q. What is the selection process for admission?
A. The selection process consists of a C-DAC Common Admission Test (C-CAT).
Q. What is the fee for the course?
A. The fee for the PG-DBDA course is Rs. 97,750/- (Rupees Ninety-Seven Thousand Seven Hundred and Fifty only) plus 18% GST for the online mode of delivery, and Rs. 1,15,000/- (Rupees One Lakh Fifteen Thousand only) plus 18% GST for the physical mode of delivery.
Q. When does the course commence?
A. The course commences twice a year, in March and September. The admission process starts in December and July for the respective batches.
Q. What is the duration of the course?
A. It is an approximately 24-week, full-time course comprising 900 hours of theory, practicals and project work.
Q. What classroom facilities are available?
A. Fully equipped classrooms with adequate capacity to accommodate students.
Q. Hostel & Canteen facility available?
A. Accommodation for outstation candidates is facilitated by some of the centres. Please contact the respective centre for details.
Q. Are the course contents revised every six months?
A. The course contents are revised according to real-world needs and whenever found necessary.
Q. Do you have centralized placement cell?
A. Yes. We do have a Centralized Placement Programme in which the respective centres participate.
Q. What is the value of the course in the international market?
A. The course has been a trend-setting course due to its unique curriculum.