Curriculum
- 52 Sections
- 465 Lessons
- 24 Weeks
- About the Course
CDAC CCAT Study Materials
Crack your CDAC CCAT exam on the first attempt with LMT Study Materials:
- 📚 Complete Study Notes (Concepts & Solved Examples)
- 📌 500+ Practice Problems
- 📝 Formulas and tricks
- 👣 Beginner Friendly
- ⭐️ Trusted by 1000+ Students
- 📑 Previous Year Question Papers
- 📚 Section A + Section B + Section C [ All Notes Available ]
The one-stop destination for everything you need for CDAC CCAT Exam Preparations
So let's dive in - enroll today, crack the exam, get into a top CDAC institute, and get to work with your dream company.
- Section A [ Quantitative ] - Profit and Loss (6 lessons)
- Section A [ Quantitative ] - Average (4 lessons)
- Section A [ Quantitative ] - Arithmetic Progression and Geometric Progression (5 lessons)
- Section A [ Quantitative ] - Ages Problems (6 lessons)
- Section A [ Quantitative ] - Alligations and Mixtures (5 lessons)
- Section A [ Quantitative ] - Boats and Streams (5 lessons)
- Section A [ Quantitative ] - Chain Rule (4 lessons)
- Section A [ Quantitative ] - HCF and LCM (6 lessons)
- Section A [ Quantitative ] - Number System (10 lessons)
- 10.1 Number System Part #1 (13 minutes)
- 10.2 Number System Part #2 (12 minutes)
- 10.3 Number System Part #3 (7 minutes)
- 10.4 Number System Part #4 (10 minutes)
- 10.5 Number System Part #5 (10 minutes)
- 10.6 Number System Part #6 (10 minutes)
- 10.7 Number System Part #7 (7 minutes)
- 10.8 Number System Part #8 (7 minutes)
- 10.9 Number System Part #9 (9 minutes)
- 10.10 Number System Part #10 (9 minutes)
- Section A [ Quantitative ] - Percentage Problems (8 lessons)
- Section A [ Quantitative ] - Permutation and Combination (6 lessons)
- Section A [ Quantitative ] - Pipes and Cistern (5 lessons)
- Section A [ Quantitative ] - Probability (4 lessons)
- Section A [ Quantitative ] - Ratio and Proportion (7 lessons)
- Section A [ Quantitative ] - Simple and Compound Interest (8 lessons)
- 16.1 Simple and Compound Interest Part #1 (9 minutes)
- 16.2 Simple and Compound Interest Part #2 (13 minutes)
- 16.3 Simple and Compound Interest Part #3 (8 minutes)
- 16.4 Simple and Compound Interest Part #4 (14 minutes)
- 16.5 Fastest Method to Solve Compound Interest (7 minutes)
- 16.6 Simple and Compound Interest [Formulas]
- 16.7 Simple and Compound Interest [Notes]
- 16.8 Simple and Compound Interest [Practice Problems]
- Section A [ Quantitative ] - Time and Work (6 lessons)
- Section A [ Quantitative ] - Train Problems (6 lessons)
- Section A [ Verbal ] - Root Words (2 lessons)
- Section A [ Verbal ] - Reading Comprehension (4 lessons)
- Section A [ Verbal ] - Subject-Verb Agreement (2 lessons)
- Section A [ Verbal ] - Synonyms and Antonyms (4 lessons)
- Section A [ Verbal ] - Ability Extra (9 lessons)
- 23.1 Introduction to Verbal Ability + Tenses Part #1 (12 minutes)
- 23.2 Tenses Part #1 (11 minutes)
- 23.3 Tenses [ Notes + Solved Examples + Practice Problems ]
- 23.4 Error Detection and Correction + Sentence Completion #1 (14 minutes)
- 23.5 Sentence Completion Part #2 (13 minutes)
- 23.6 Spotting Errors [ Notes + Solved Examples + Practice Problems ]
- 23.7 Sentence Completion [ Notes + Solved Examples + Practice Problems ]
- 23.8 Assumption in Verbal Reasoning (14 minutes)
- 23.9 Deduction in Verbal Reasoning (26 minutes)
- Section A [ Reasoning ] - Blood Relations (6 lessons)
- Section A [ Reasoning ] - Calendar (4 lessons)
- Section A [ Reasoning ] - Coding-Decoding (7 lessons)
- Section A [ Reasoning ] - Data Sufficiency (3 lessons)
- Section A [ Reasoning ] - Clock Problems (6 lessons)
- Section A [ Reasoning ] - Direction (8 lessons)
- Section A [ Reasoning ] - Number Series (6 lessons)
- Section A [ Reasoning ] - Seating Arrangement (8 lessons)
- 31.1 Seating Arrangement Part #1 (8 minutes)
- 31.2 Seating Arrangement Part #2 (12 minutes)
- 31.3 Seating Arrangement Part #3 (11 minutes)
- 31.4 Seating Arrangement Part #4 (6 minutes)
- 31.5 Seating Arrangement Part #5 (7 minutes)
- 31.6 Seating Arrangement Part #6 (6 minutes)
- 31.7 Seating Arrangement [Notes]
- 31.8 Seating Arrangement [ Practice Problems ]
- Section A [ Reasoning ] - Visual Reasoning (6 lessons)
- Section A - Computer Fundamentals (12 lessons)
- 33.1 Introduction (10 minutes)
- 33.2 Computer Memory and Its Types (14 minutes)
- 33.3 Motherboard and Its Components (8 minutes)
- 33.4 Ports, Cables and Graphic Card (9 minutes)
- 33.5 Storage Devices (7 minutes)
- 33.6 Number System (11 minutes)
- 33.7 Machine Language (6 minutes)
- 33.8 Operating Systems (10 minutes)
- 33.9 AR and VR (7 minutes)
- 33.10 Computer Virus (8 minutes)
- 33.11 Computer Fundamentals [Notes]
- 33.12 Computer Fundamentals Practice Problems
- Section B - [ C Programming ] (32 lessons)
- 34.1 C Programming Part #1 (10 minutes)
- 34.2 C Programming Part #2 (12 minutes)
- 34.3 C Programming Part #3 (16 minutes)
- 34.4 C Programming Part #4 (21 minutes)
- 34.5 C Programming Part #5 (23 minutes)
- 34.6 C Programming Part #6 (15 minutes)
- 34.7 C Programming Part #7 (15 minutes)
- 34.8 C Programming Part #8 (14 minutes)
- 34.9 C Programming Part #9 (15 minutes)
- 34.10 C Programming Part #10 (16 minutes)
- 34.11 C Programming Part #11 (12 minutes)
- 34.12 C Programming Part #12 (12 minutes)
- 34.13 C Programming Part #13 (19 minutes)
- 34.14 C Programming Part #14 (13 minutes)
- 34.15 C Programming Part #15 (15 minutes)
- 34.16 C Programming Part #16 (12 minutes)
- 34.17 C Programming Part #17 (13 minutes)
- 34.18 C Programming Part #18 (11 minutes)
- 34.19 C Programming Part #19 (16 minutes)
- 34.20 C Programming Part #20 (13 minutes)
- 34.21 C Programming Part #21 (23 minutes)
- 34.22 C Programming Part #22 (15 minutes)
- 34.23 C Programming Part #23 (14 minutes)
- 34.24 C Programming Part #24 (17 minutes)
- 34.25 C Programming Part #25 (11 minutes)
- 34.26 C Programming Part #26 (13 minutes)
- 34.27 C Programming Part #27 (14 minutes)
- 34.28 C Programming Part #28 (10 minutes)
- 34.29 C Programming Part #29 (10 minutes)
- 34.30 C Programming Part #30 (11 minutes)
- 34.31 C Programming [Notes]
- 34.32 C Programming Practice Problems
- Section B - [ C Snippet Problems with Solution Videos ] (10 lessons)
- 35.1 Code Snippets 1–5 (22 minutes)
- 35.2 Code Snippets 6–10 (20 minutes)
- 35.3 Code Snippets 11–15 (19 minutes)
- 35.4 Code Snippets 16–20 (15 minutes)
- 35.5 Code Snippets 21–25 (24 minutes)
- 35.6 Code Snippets 26–30 (18 minutes)
- 35.7 Code Snippets 31–35 (25 minutes)
- 35.8 Code Snippets 36–40 (25 minutes)
- 35.9 Code Snippets 41–45 (13 minutes)
- 35.10 Code Snippets 46–50 (18 minutes)
- Section B - [ Object Oriented Programming (C++) ] (27 lessons)
- 36.1 Object Oriented Programming (C++) Sample
- 36.2 Object Oriented Programming (C++)
- 36.3 Object Oriented Programming (C++) Practice Problems
- 36.4 Introduction to C++ (11 minutes)
- 36.5 Input & Output in C++ (14 minutes)
- 36.6 Difference between Structures in C & C++ (14 minutes)
- 36.7 Classes & Objects in C++ (14 minutes)
- 36.8 Inspectors, Mutators, Facilitators (16 minutes)
- 36.9 Constructor & Destructor (15 minutes)
- 36.10 Default Arguments & Inline Functions (16 minutes)
- 36.11 Array of Objects (16 minutes)
- 36.12 Dynamic Memory Allocation (15 minutes)
- 36.13 Static Members and Functions (15 minutes)
- 36.14 Exception Handling (18 minutes)
- 36.15 Composition (22 minutes)
- 36.16 Friend Class and Function (13 minutes)
- 36.17 Function Overloading (15 minutes)
- 36.18 Operator Overloading (13 minutes)
- 36.19 Copy Constructor (5 minutes)
- 36.20 Inheritance (15 minutes)
- 36.21 Pure Virtual Function (13 minutes)
- 36.22 Types of Inheritance (26 minutes)
- 36.23 Virtual Function (12 minutes)
- 36.24 Templates (15 minutes)
- 36.25 RTTI (11 minutes)
- 36.26 Casting Operators (19 minutes)
- 36.27 Dynamic Array of Objects (13 minutes)
- Section B - [ Operating System ] (52 lessons)
- 37.1 Introduction to Operating System [ Operating System Overview ] (9 minutes)
- 37.2 Monolithic and Micro Kernel Architecture [ Operating System Overview ] (6 minutes)
- 37.3 User Space and Kernel Space [ Operating System Overview ] (6 minutes)
- 37.4 PCB (Process Control Block) [ Process and Process Scheduling ] (6 minutes)
- 37.5 Process State Transition Diagram [ Process and Process Scheduling ] (11 minutes)
- 37.6 FCFS (First Come, First Served) [ Process and Process Scheduling ] (8 minutes)
- 37.7 SJF (Preemptive) with Solved Example [ Process and Process Scheduling ] (9 minutes)
- 37.8 SJF (Non-Preemptive) with Solved Example [ Process and Process Scheduling ] (5 minutes)
- 37.9 Preemptive Priority Process Scheduling [ Process and Process Scheduling ] (8 minutes)
- 37.10 Round Robin Scheduling Algorithm with Solved Example [ Process and Process Scheduling ] (12 minutes)
- 37.11 Producer-Consumer Problem [ Process Synchronization and Deadlock ] (8 minutes)
- 37.12 Critical Section Problem [ Process Synchronization and Deadlock ] (7 minutes)
- 37.13 Deadlock with Necessary and Sufficient Conditions [ Process Synchronization and Deadlock ] (4 minutes)
- 37.14 Banker's Algorithm with Solved Example, Part 1 [ Process Synchronization and Deadlock ] (13 minutes)
- 37.15 Banker's (Resource Request) Algorithm with Solved Example, Part 2 [ Process Synchronization and Deadlock ] (9 minutes)
- 37.16 Deadlock Recovery [ Process Synchronization and Deadlock ] (5 minutes)
- 37.17 Dining Philosophers Problem [ Process Synchronization and Deadlock ] (8 minutes)
- 37.18 Introduction to Memory in Operating Systems [ Memory Management ] (8 minutes)
- 37.19 Memory Management Part #1 [ Memory Management ] (7 minutes)
- 37.20 Memory Management Part #2 [ Memory Management ] (5 minutes)
- 37.21 Memory Management Part #3 [ Memory Management ] (8 minutes)
- 37.22 Loading and Linking [ Memory Management ] (4 minutes)
- 37.23 FIFO, LRU and Optimal Page Replacement Algorithms [ Memory Management ] (20 minutes)
- 37.24 Disk Scheduling Algorithms [ Input/Output Management ] (6 minutes)
- 37.25 FCFS Disk Scheduling Algorithm [ Input/Output Management ] (13 minutes)
- 37.26 SSTF Disk Scheduling Algorithm [ Input/Output Management ] (7 minutes)
- 37.27 SCAN Disk Scheduling Algorithm [ Input/Output Management ] (6 minutes)
- 37.28 C-SCAN Disk Scheduling Algorithm [ Input/Output Management ] (5 minutes)
- 37.29 LOOK Disk Scheduling Algorithm [ Input/Output Management ] (4 minutes)
- 37.30 C-LOOK Disk Scheduling Algorithm [ Input/Output Management ] (4 minutes)
- 37.31 FCFS, SSTF, SCAN, C-SCAN, LOOK and C-LOOK Overview [ Input/Output Management ] (8 minutes)
- 37.32 Connection-Oriented vs Connectionless [ OSI Layers ] (8 minutes)
- 37.33 OSI Reference Model [ OSI Layers ] (6 minutes)
- 37.34 TCP/IP Reference Model [ OSI Layers ] (6 minutes)
- 37.35 OSI vs TCP/IP Model Comparison [ OSI Layers ] (7 minutes)
- 37.36 IP Address vs MAC Address [ IP Addressing ] (8 minutes)
- 37.37 IPv4 vs IPv6 Comparison [ IP Addressing ] (9 minutes)
- 37.38 IPv4 Header Format [ IP Addressing ] (12 minutes)
- 37.39 Routing Algorithms Part #1 [ IP Addressing ] (11 minutes)
- 37.40 Routing Algorithms Part #2 [ IP Addressing ] (9 minutes)
- 37.41 ARP and RARP [ IP Addressing ] (7 minutes)
- 37.42 Leaky Bucket Algorithm [ IP Addressing ] (5 minutes)
- 37.43 Token Bucket Algorithm [ IP Addressing ] (4 minutes)
- 37.44 Domain Name System (DNS) [ Common TCP/IP Stack Protocols ] (6 minutes)
- 37.45 Hypertext Transfer Protocol (HTTP) [ Common TCP/IP Stack Protocols ] (8 minutes)
- 37.46 Simple Mail Transfer Protocol (SMTP) [ Common TCP/IP Stack Protocols ] (5 minutes)
- 37.47 File Transfer Protocol (FTP) [ Common TCP/IP Stack Protocols ] (5 minutes)
- 37.48 Operating System Sample [Notes]
- 37.49 Operating System [Notes]
- 37.50 Operating System Practice Problems
- 37.51 Computer Network [Notes]
- 37.52 Computer Network Practice Problems
- Section B - [ Data Structures ] (92 lessons)
- 38.1 Introduction [DSA Introduction] (8 minutes)
- 38.2 Types of Data Structures [DSA Introduction] (6 minutes)
- 38.3 Operations of Data Structures [DSA Introduction] (6 minutes)
- 38.4 Concept of Abstract Data Type [DSA Introduction] (6 minutes)
- 38.5 Arrays – Part 1 [DSA Introduction] (16 minutes)
- 38.6 Arrays – Part 2 [DSA Introduction] (19 minutes)
- 38.7 Introduction to Stack [Stacks] (4 minutes)
- 38.8 Operations of Stack ADT [Stacks] (6 minutes)
- 38.9 Stack Implementation Using Array – Part 1 [Stacks] (10 minutes)
- 38.10 Stack Implementation Using Array – Part 2 [Stacks] (12 minutes)
- 38.11 Operations of Queue ADT [Queues] (4 minutes)
- 38.12 Introduction to Queue [Queues] (4 minutes)
- 38.13 Queue Implementation Using Array – Part 1 [Queues] (11 minutes)
- 38.14 Queue Implementation Using Array – Part 2 [Queues] (9 minutes)
- 38.15 Circular Queue Implementation – Part 1 [Queues] (12 minutes)
- 38.16 Circular Queue Implementation – Part 2 [Queues] (9 minutes)
- 38.17 Introduction to Linked List [Linked Lists] (5 minutes)
- 38.18 Array vs Linked List [Linked Lists] (7 minutes)
- 38.19 Linked List Implementation – Part 1 [Linked Lists] (17 minutes)
- 38.20 Linked List Implementation – Part 2 [Linked Lists] (10 minutes)
- 38.21 Linked List Implementation – Part 3 [Linked Lists] (10 minutes)
- 38.22 Linked List Implementation – Part 4 [Linked Lists] (14 minutes)
- 38.23 Linked List Implementation – Part 5 [Linked Lists] (9 minutes)
- 38.24 Linked List Implementation – Part 6 [Linked Lists] (10 minutes)
- 38.25 Stack Implementation Using Linked List – Part 1 [Linked Lists] (9 minutes)
- 38.26 Stack Implementation Using Linked List – Part 2 [Linked Lists] (6 minutes)
- 38.27 Queue Using Linked List – Part 1 [Linked Lists] (10 minutes)
- 38.28 Queue Using Linked List – Part 2 [Linked Lists] (7 minutes)
- 38.29 Circular Queue Using Linked List [Linked Lists] (10 minutes)
- 38.30 Implementation of Doubly Linked List – Part 1 [Linked Lists] (14 minutes)
- 38.31 Implementation of Doubly Linked List – Part 2 [Linked Lists] (8 minutes)
- 38.32 Implementation of Doubly Linked List – Part 3 [Linked Lists] (10 minutes)
- 38.33 Implementation of Doubly Linked List – Part 4 [Linked Lists] (10 minutes)
- 38.34 Implementation of Doubly Linked List – Part 5 [Linked Lists] (12 minutes)
- 38.38 Data Structures and Algorithms Sample
- 38.39 Data Structures and Algorithms
- 38.40 Data Structures and Algorithms Practice Problems
- 38.41 Binary Tree Implementation Part 1 – Insertion & Traversing [Trees] (14 minutes)
- 38.42 Introduction to Tree Data Structure & Binary Tree [Trees] (6 minutes)
- 38.43 Binary Tree Implementation Part 2 – Traversing & Deletion [Trees] (12 minutes)
- 38.44 Binary Tree Traversal [Trees] (14 minutes)
- 38.45 Tree Traversal Methods: Pre-, Post- and In-Order Traversal [Trees] (14 minutes)
- 38.46 (Binary Search Tree) – Traverse the Tree In-Order, Pre-Order and Post-Order [Trees] (11 minutes)
- 38.47 Height-Balanced Binary Search Tree [Trees] (12 minutes)
- 38.48 AVL Tree [Trees] (16 minutes)
- 38.49 AVL Tree Solved Example #1 [Trees] (17 minutes)
- 38.50 AVL Tree Solved Example #2 [Trees] (17 minutes)
- 38.51 B-Tree Introduction [Trees] (12 minutes)
- 38.52 B-Tree Solved Example #1 [Trees] (14 minutes)
- 38.53 B-Tree Solved Example #2 [Trees] (12 minutes)
- 38.54 Huffman Coding [Trees] (7 minutes)
- 38.55 BFS & DFS #1 [Graphs] (14 minutes)
- 38.56 Topological Sorting [Graphs] (11 minutes)
- 38.57 Graphs Introduction [Graphs] (13 minutes)
- 38.58 BFS & DFS #2 [Graphs] (12 minutes)
- 38.59 Topological Sorting – Solved Example [Graphs] (9 minutes)
- 38.60 Linear Search [Recursion and Storage Management] (5 minutes)
- 38.61 Recursion – Winding & Unwinding [Recursion and Storage Management] (12 minutes)
- 38.62 Binary Search [Recursion and Storage Management] (11 minutes)
- 38.63 Introduction to Buddy System [Recursion and Storage Management] (10 minutes)
- 38.64 Problems on Buddy System [Recursion and Storage Management] (8 minutes)
- 38.65 Hashing Collision [Searching & Sorting] (12 minutes)
- 38.66 Hashing Concept [Searching & Sorting] (6 minutes)
- 38.67 Linear Probing #2 [Searching & Sorting] (12 minutes)
- 38.68 Linear Probing #1 [Searching & Sorting] (13 minutes)
- 38.69 Linear Probing with Collisions – Solved [Searching & Sorting] (10 minutes)
- 38.70 Linear + Quadratic Probing – Solved Example [Searching & Sorting] (15 minutes)
- 38.71 Searching & Sorting Introduction [Searching & Sorting] (12 minutes)
- 38.72 Selection Sorting #2 [Searching & Sorting] (12 minutes)
- 38.73 Selection Sorting #1 [Searching & Sorting] (12 minutes)
- 38.74 Insertion Sort #1 [Searching & Sorting] (12 minutes)
- 38.75 Insertion Sort #2 [Searching & Sorting] (12 minutes)
- 38.76 Merge Sort #2 [Searching & Sorting] (12 minutes)
- 38.77 Merge Sort #1 [Searching & Sorting] (12 minutes)
- 38.78 Merge Sort #3 [Searching & Sorting] (10 minutes)
- 38.79 Quick Sort #1 [Searching & Sorting] (10 minutes)
- 38.80 Quick Sort #2 [Searching & Sorting] (9 minutes)
- 38.81 Radix Sort #1 [Searching & Sorting] (10 minutes)
- 38.82 Radix Sort #2 [Searching & Sorting] (10 minutes)
- 38.83 Heap Data Structure [Applications of Data Structures] (11 minutes)
- 38.84 Min & Max Heap [Applications of Data Structures] (12 minutes)
- 38.85 Infix Notation & Conversion [Applications of Data Structures] (11 minutes)
- 38.86 Prefix, Postfix Notation & Conversion [Applications of Data Structures] (12 minutes)
- 38.87 Prim's & Kruskal's #1 [Applications of Data Structures] (19 minutes)
- 38.88 Prim's & Kruskal's #2 [Applications of Data Structures] (10 minutes)
- 38.89 Viva Questions – Data Structures
- 38.90 [Soln] Mod 1 – Stack, Queue & Linked List [MU – Dec-2024 Importance Solutions]
- 38.91 [Soln] Mod 2 – Trees [MU – Dec-2024 Importance Solutions]
- 38.92 [Soln] Mod 3 – Graphs [MU – Dec-2024 Importance Solutions]
- 38.93 [Soln] Mod 4 – Recursion & Storage Management [MU – Dec-2024 Importance Solutions]
- 38.94 [Soln] Mod 5 – Searching & Sorting [MU – Dec-2024 Importance Solutions]
- 38.95 [Soln] Mod 6 – Applications of Data Structures [MU – Dec-2024 Importance Solutions]
- Section B - [ Basics of Big Data and Artificial Intelligence ] (20 lessons)
- 39.1 Agent and PEAS Description [ Artificial Intelligence ] (8 minutes)
- 39.2 Introduction to Fuzzy Logic [ Artificial Intelligence ] (4 minutes)
- 39.3 Types of Agents [ Artificial Intelligence ] (8 minutes)
- 39.4 Learning Agent [ Artificial Intelligence ] (8 minutes)
- 39.5 Introduction to Machine Learning [ Artificial Intelligence ] (12 minutes)
- 39.6 Types of Machine Learning [ Artificial Intelligence ] (5 minutes)
- 39.7 Introduction to Neural Networks [ Artificial Intelligence ] (6 minutes)
- 39.8 Genetic Algorithm [ Artificial Intelligence ] (5 minutes)
- 39.9 Introduction to Big Data [ Big Data ] (6 minutes)
- 39.10 Hadoop Part 1 [ Big Data ] (10 minutes)
- 39.11 Hadoop Part 2 [ Big Data ] (10 minutes)
- 39.12 MapReduce [ Big Data ] (11 minutes)
- 39.13 Introduction to NoSQL [ Big Data ] (8 minutes)
- 39.14 Introduction to Data Warehouse [ Big Data ] (10 minutes)
- 39.15 Metadata [ Big Data ] (4 minutes)
- 39.16 Data Mart [ Big Data ] (6 minutes)
- 39.17 Architecture of Data Warehouse [ Big Data ] (7 minutes)
- 39.18 OLAP Operations [ Big Data ] (8 minutes)
- 39.19 OLAP vs OLTP [ Big Data ] (7 minutes)
- 39.20 ETL – Extract, Transform and Load [ Big Data ] (8 minutes)
- Section B - [ AI Notes ] (2 lessons)
- Section B - [ Big Data Notes ] (5 lessons)
- Section B - [ AI Notes for CCAT Exam ] (5 lessons)
- Section B - [ Big Data Notes for CCAT Exam ] (1 lesson)
- Quiz (6 lessons)
- Previous Year Leak Papers (6 lessons)
- CCAT Previous Year Leak Paper Solutions (8 lessons)
- Section A [ Verbal ] - Sentence Corrections (2 lessons)
- Section A [ Verbal ] - Error Identification (1 lesson)
- Section A [ Verbal ] - Sentence Completion (1 lesson)
- Section A [ Verbal ] - Sentence Rearrangement (2 lessons)
- Section A [ Verbal ] - Paragraph (1 lesson)
- Section A [ Verbal ] - Fill in the Blanks (4 lessons)

Data Structures & Algorithms
1. Data Structures & Algorithms
★ Data Structures
A data structure is a specialized format for organizing, processing, retrieving and storing
data. There are several basic and advanced types of data structures, all designed to
arrange data to suit a specific purpose. Data structures make it easy for users to access
and work with the data they need in appropriate ways. Most importantly, data structures
frame the organization of information so that machines and humans can better understand
it.
Why are data structures important?
Typical base data types, such as integers or floating-point values, that are available in
most computer programming languages are generally insufficient to capture the logical
intent for data processing and use. Yet applications that ingest, manipulate and produce
information must understand how data should be organized to simplify processing. Data
structures bring together the data elements in a logical way and facilitate the effective
use, persistence and sharing of data. They provide a formal model that describes the way
the data elements are organized.
Some examples of how data structures are used include the following:
● Storing data. Data structures are used for efficient data persistence, such as specifying
the collection of attributes and corresponding structures used to store records in a
database management system.
● Managing resources and services. Core operating system (OS) resources and
services are enabled through the use of data structures such as linked lists for memory
allocation, file directory management and file structure trees, as well as process
scheduling queues.
● Data exchange. Data structures define the organization of information shared between
applications, such as TCP/IP packets.
● Ordering and sorting. Data structures such as binary search trees -- also known as
ordered or sorted binary trees -- provide efficient methods of sorting objects, such as
character strings used as tags. With data structures such as priority queues,
programmers can manage items organized according to a specific priority.
● Indexing. Even more sophisticated data structures such as B-trees are used to index
objects, such as those stored in a database.

● Searching. Indexes created using binary search trees, B-trees or hash tables speed the
ability to find a specific sought-after item.
● Scalability. Big data applications use data structures for allocating and managing data
storage across distributed storage locations, ensuring scalability and performance.
Types of Data Structure
Basically, data structures are divided into two categories:
● Linear data structure
● Non-linear data structure
Let's learn about each type in detail.
Linear data structures
In linear data structures, the elements are arranged in sequence, one after the other. Since
the elements are arranged in a particular order, they are easy to implement.
However, when the complexity of the program increases, the linear data structures might
not be the best choice because of operational complexities.
Popular linear data structures are:
1. Array Data Structure
In an array, elements are stored in contiguous memory locations. All the elements of an
array are of the same type, and the types of elements that can be stored in an array are
determined by the programming language.
2. Stack Data Structure
In the stack data structure, elements are stored according to the LIFO (last in, first out)
principle. That is, the last element stored in a stack will be removed first.
It works just like a pile of plates where the last plate kept on the pile will be removed
first.
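The LIFO behaviour can be sketched with a plain Python list, pushing with append and popping from the same end (a minimal sketch; the values are illustrative):

stack = []
stack.append("plate 1")   # push
stack.append("plate 2")   # push
stack.append("plate 3")   # push
print(stack.pop())        # "plate 3" -- the last plate added leaves first
print(stack.pop())        # "plate 2"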

3. Queue Data Structure
Unlike a stack, the queue data structure works on the FIFO (first in, first out) principle:
the first element stored in the queue will be removed first.
It works just like a queue of people at a ticket counter, where the first person in the
queue gets the ticket first.
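The FIFO behaviour can be sketched with collections.deque from Python's standard library, appending at the back and removing from the front (a minimal sketch; the values are illustrative):

from collections import deque

queue = deque()
queue.append("person 1")   # first person joins the queue
queue.append("person 2")
queue.append("person 3")
print(queue.popleft())     # "person 1" -- first in, first out
print(queue.popleft())     # "person 2"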
4. Linked List Data Structure
In the linked list data structure, data elements are connected through a series of nodes,
and each node contains a data item and the address of the next node.
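A minimal sketch of such a node in Python (the Node class and field names here are illustrative, not from the notes):

class Node:
    def __init__(self, data):
        self.data = data    # the data item
        self.next = None    # address (reference) of the next node

head = Node(1)
head.next = Node(2)         # link the second node after the first
head.next.next = Node(3)

node = head
while node is not None:     # follow the next references to traverse the list
    print(node.data)
    node = node.next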
Non-linear data structures
Unlike linear data structures, elements in non-linear data structures are not arranged in
any sequence. Instead, they are arranged hierarchically, with one element connected to
one or more other elements.
Non-linear data structures are further divided into graph and tree based data structures.
1. Graph Data Structure
In the graph data structure, each node is called a vertex, and each vertex is connected to
other vertices through edges.

Popular Graph Based Data Structures:
● Spanning Tree and Minimum Spanning Tree
● Strongly Connected Components
● Adjacency Matrix
● Adjacency List
2. Trees Data Structure
Similar to a graph, a tree is also a collection of vertices and edges. However, in the tree
data structure there is exactly one path between any two vertices (a tree has no cycles).
Popular Tree based Data Structure
● Binary Tree
● Binary Search Tree
● AVL Tree
● B-Tree
● B+ Tree
● Red-Black Tree

Linear vs Non-linear Data Structures
Now that we know about linear and non-linear data structures, let's see the major
differences between them.
● Arrangement: in a linear data structure, elements are arranged in a single sequence,
one after the other; in a non-linear data structure, elements are arranged hierarchically.
● Levels: all elements of a linear structure sit on a single level, while a non-linear
structure spans multiple levels (such as parent and child nodes).
● Examples: arrays, stacks, queues and linked lists are linear; trees and graphs are
non-linear.
Data structure operations
● Traversing: Accessing each element exactly once is known as traversing. It is also
known as visiting.
● Insertion: Adding a new element to the data structure.
● Deletion: Removing an element from the data structure.
● Searching: Finding the location of an item in the data structure.
● Sorting: Arranging the items in ascending or descending order.
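These five operations can be demonstrated on a plain Python list (a minimal sketch; the values are illustrative):

items = [42, 7, 19]

for x in items:            # traversing: visit each element exactly once
    print(x)

items.append(99)           # insertion: add a new element
items.remove(7)            # deletion: remove an element
print(items.index(19))     # searching: find the location of an item
items.sort()               # sorting: arrange items in ascending order
print(items)               # [19, 42, 99]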
★ Algorithms

In computer programming terms, an algorithm is a set of well-defined instructions to
solve a particular problem. It takes a set of inputs and produces a desired output. For
example,
An algorithm to add two numbers:
1. Take two number inputs
2. Add numbers using the + operator
3. Display the result
Qualities of Good Algorithms
● Input and output should be defined precisely.
● Each step in the algorithm should be clear and unambiguous.
● An algorithm should be the most effective among the many different ways to solve a problem.
● An algorithm shouldn't include computer code. Instead, the algorithm should be
written in such a way that it can be used in different programming languages.
Algorithm 1: Add two numbers entered by the user
Step 1: Start
Step 2: Declare variables num1, num2 and sum.
Step 3: Read values num1 and num2.
Step 4: Add num1 and num2 and assign the result to sum.
sum ← num1 + num2
Step 5: Display sum
Step 6: Stop
Algorithm 2: Find the largest number among three numbers
Step 1: Start
Step 2: Declare variables a, b and c.
Step 3: Read variables a, b and c.
Step 4: If a > b
    If a > c
        Display a is the largest number.
    Else
        Display c is the largest number.
Else
    If b > c
        Display b is the largest number.
    Else
        Display c is the largest number.
Step 5: Stop
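Algorithm 2 translates directly into Python; a minimal sketch with sample values (the inputs are illustrative):

a, b, c = 12, 7, 25   # sample inputs

if a > b:
    if a > c:
        print(a, "is the largest number.")
    else:
        print(c, "is the largest number.")
else:
    if b > c:
        print(b, "is the largest number.")
    else:
        print(c, "is the largest number.")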
Algorithm 3: Find the roots of the quadratic equation ax² + bx + c = 0
Step 1: Start
Step 2: Declare variables a, b, c, D, r1, r2, rp and ip.
Step 3: Calculate the discriminant
    D ← b² - 4ac
Step 4: If D ≥ 0
    r1 ← (-b + √D)/(2a)
    r2 ← (-b - √D)/(2a)
    Display r1 and r2 as roots.
Else
    Calculate the real part and imaginary part
    rp ← -b/(2a)
    ip ← √(-D)/(2a)
    Display rp + j(ip) and rp - j(ip) as roots.
Step 5: Stop
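Algorithm 3 can be sketched in Python with the math module (the sample coefficients are illustrative):

import math

a, b, c = 1.0, -3.0, 2.0        # coefficients of ax² + bx + c = 0
D = b*b - 4*a*c                 # discriminant

if D >= 0:
    r1 = (-b + math.sqrt(D)) / (2*a)
    r2 = (-b - math.sqrt(D)) / (2*a)
    print("roots:", r1, r2)     # real roots
else:
    rp = -b / (2*a)             # real part
    ip = math.sqrt(-D) / (2*a)  # imaginary part
    print(f"roots: {rp}+j{ip} and {rp}-j{ip}")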
An algorithm is said to be efficient and fast if it takes less time to execute and consumes
less memory space. The performance of an algorithm is measured on the basis of the
following properties:
1. Time Complexity
2. Space Complexity
Space Complexity
Space complexity of an algorithm is the amount of space it uses for execution in relation
to the size of the input.
n = int(input())
nums = []
for i in range(1, n+1):
    nums.append(i*i)

In this example, the length of the list we create depends on the input value we provide for
n.
Let’s say adding a single integer to the list takes c space and other initial operations,
including creating a new list, takes d space. Then, we can create an equation for the space
taken by the above algorithm like this.
for any n -> c*n + d
when n = 10 -> c*10 + d
when n = 100 -> c*100 + d
The value calculated by this equation is the space the algorithm needs to complete
execution. The values of the constants c and d are outside of the control of the algorithm
and depend on factors such as programming language, hardware specifications, etc.
However, we don’t need the exact value this equation calculates to talk about the space
complexity of an algorithm. Instead, we use the highest order of the variable n as a
representative of the space complexity.
For example, the above algorithm has a space complexity in the order of n. If another
algorithm has the equation c*n² + d*n + e for the space it needs, we say it has an order
of n² space complexity.
Time Complexity
Time complexity is the number of elementary operations an algorithm performs in
relation to the input size. Here, we count the number of operations, instead of time itself,
based on the assumption that each operation takes a fixed amount of time to complete.
If we look at the previous algorithm again, it performs n operations (n iterations of the
loop) to complete its execution.
If we construct a similar equation for time complexity as we did before, it also takes the
shape of c*n + d, with c as the fixed time taken for each loop iteration and d as the fixed
time taken for other initial operations.
Therefore, the time complexity of this algorithm is also in the order of n.
Asymptotic Analysis
As you saw in these examples, we can’t compare one algorithm to another using exact
values because they depend on the tools we use and underlying hardware. In fact, if we
calculated time and space values for two instances of running the same algorithm on the

same system, there would be differences in the values we get due to subtle changes in the
system environment.
Therefore, we use Asymptotic Analysis to compare the space and time complexity of two
algorithms. It analyzes the performance of an algorithm against the input size. It
evaluates how the performance changes as the input size increases. This type of analysis
doesn’t need actual values for space or time taken by the algorithm for comparison.
Best, Worst, and Average Cases
Usually, in asymptotic analysis, we consider three cases when analyzing an algorithm:
best, worst, and average.
To understand each case, let’s take an example of a linear search algorithm. We use a
simple for loop to search if a given integer k is present in a list named nums of size n.
def linear_search(nums, n, k):
    for i in range(n):
        if k == nums[i]:
            return i
    return -1
Let's consider the best, worst, and average-case scenarios for this algorithm in terms of
time complexity. (We can talk about these three scenarios in terms of space complexity
too.)
Best Case
We consider the combination of inputs that allows the algorithm to complete its execution
in the minimum amount of time as the best-case scenario in terms of time complexity.
The execution time in this case acts as a lower bound to the time complexity of the
algorithm.
In linear search, the best-case scenario occurs when k is stored at the 0th index of the list.
In this case, the algorithm can complete execution after only one iteration of the for loop.
nums = [1, 2, 3, 4, 5, 6]
n = 6
k = 1

Worst Case
The worst-case scenario occurs when the combination of inputs that takes the maximum
amount of time for completion is passed to the algorithm. The execution time of the
worst case acts as an upper bound to the time complexity of the algorithm.
In linear search, the worst case occurs when k is not present in the list. This takes the
algorithm n+1 iterations to figure out that the number is not in the list.
nums = [1, 2, 3, 4, 5, 6]
n = 6
k = 7
Average Case
To find the average case, we get the sum of running times of the algorithm for every
possible input combination and take their average.
In linear search, the number of iterations the algorithm takes to complete execution
follows this pattern.
When k is stored at the 0th index -> 1 iteration
When k is stored at the 1st index -> 2 iterations
When k is stored at the 2nd index -> 3 iterations
When k is stored at the 3rd index -> 4 iterations
...
When k is stored at the (n-1)th index -> n iterations
When k is not in the list -> n+1 iterations
So, we can calculate the average running time of the algorithm this way. Assuming each
of these n+1 scenarios is equally likely, the average number of iterations is

(1 + 2 + ... + n + (n+1)) / (n+1) = (n+2)/2

which is in the order of n.
★ Asymptotic Notation
Asymptotic notation is a mathematical notation used to represent the time and space
complexity of algorithms in asymptotic analysis. We mainly use three asymptotic
notations to represent the best, worst, and average cases of algorithms.
Ω (Big-Omega) Notation
Ω notation denotes an asymptotic lower bound of a function. In other words, it says that
the function should output at least the respective big-omega value for a given input.

For a function g(n), we can define the set of functions denoted by Ω(g(n)) as follows.
Ω(g(n)) = {
    f(n) : there exist positive constants c and n0 such that
           0 <= c*g(n) <= f(n) for all n >= n0
}
It’s a mouthful. But let’s break down this definition with an example and try to
understand what it means.
First, let's take the function g(n) = n².
Now, the big-omega of g(n) represents the set of functions that satisfy the condition
0 <= c*g(n) <= f(n) for all n >= n0, where c and n0 are positive constants.
Let's consider the function f(n) = 2n² + 4.
For c = 1 and n0 = 1, 0 <= c*g(n) <= f(n) for all n >= n0.
Therefore, f(n) = Ω(g(n)).
Now, if we consider f(n) = 3n + 5, we can't find values for the constants c and n0 that
satisfy the above condition. Therefore, f(n) = 3n + 5 doesn't belong to big-omega of g(n).
In time and space complexity, Ω notation is used to represent the best-case scenario of an
algorithm. It can provide lower bounds to time and space complexity.
O (Big-O) Notation
O notation denotes an asymptotic upper bound of a function. In other words, the function
should output at most the respective big-O value for a given input.
For a function g(n), the definition of the set O(g(n)) is as follows.
O(g(n)) = {
    f(n) : there exist positive constants c and n0 such that
           0 <= f(n) <= c*g(n) for all n >= n0
}
Again, let’s use an example to understand this definition.
g(n) = n²
f(n) = 2n² + 4
For c = 5 and n0 = 2, 0 <= f(n) <= c*g(n) for all n >= n0.
Therefore, f(n) = O(g(n)).

And if we consider f(n) = n³ + 2, it doesn't belong to O(g(n)) because no combination of
values for c and n0 satisfies the required condition.
We use O notation to represent the worst-case time and space complexity of an algorithm.
Θ (Big-Theta) Notation
Θ notation denotes an upper and a lower bound of a function. Therefore, it defines both at
most and at least boundaries for the values the function can take for a given input.
The standard definition of the Θ notation is as follows.
Θ(g(n)) = {
    f(n) : there exist positive constants c1, c2 and n0 such that
           0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0
}
Let's use an example to understand this definition with the g(n) and f(n) functions we
have used so far.
g(n) = n²
f(n) = 2n² + 4
For n0 = 2, c1 = 1 and c2 = 5, 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
Therefore, f(n) = Θ(g(n))
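These constants can be sanity-checked numerically; a minimal Python sketch (the upper limit of 1000 is arbitrary):

# Verify 0 <= c1*g(n) <= f(n) <= c2*g(n) for sampled n >= n0
def f(n): return 2*n*n + 4
def g(n): return n*n

c1, c2, n0 = 1, 5, 2
for n in range(n0, 1000):
    assert 0 <= c1*g(n) <= f(n) <= c2*g(n)
print("Theta bounds hold for n =", n0, "through 999")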
Big-theta notation is used to define the average case time and space complexity of an
algorithm.