Hi everyone! This GitHub repo collects the core data structures of DSA, their algorithms, and solutions to selected problems. I hope it helps you out.
An array is a data structure that can store a fixed-size sequence of elements of the same type. Each element in an array is accessed by its index, which is an integer that indicates the position of the element within the array. Arrays are useful for organizing data that can be processed in a sequence, such as lists of numbers, strings, or other objects.
Arrays are essential for several reasons:
- Efficient Data Access: Arrays provide constant-time access to elements using their index.
- Memory Efficiency: Arrays allocate a contiguous block of memory for elements, which can lead to better cache performance.
- Simplicity: Arrays are simple to implement and use for a wide range of problems.
- Data Organization: Arrays help organize data in a structured manner, making it easier to perform operations like searching, sorting, and iterating.
Common types of arrays:
- One-dimensional Array: A linear sequence of elements.
- Multi-dimensional Array: Arrays with more than one dimension, like 2D arrays (matrices) or 3D arrays.
- Dynamic Arrays: Arrays that can resize themselves at runtime (e.g., `std::vector` in C++).
Basic operations on arrays:
- Initialization: Declaring and initializing an array.
- Accessing Elements: Accessing elements using their index.
- Modifying Elements: Changing the value of elements.
- Traversing: Iterating through all elements.
- Insertion: Adding elements at a specific position.
- Deletion: Removing elements from a specific position.
- Searching: Finding elements based on certain criteria.
- Sorting: Rearranging elements in a specific order.
A stack is a linear data structure that follows the Last In, First Out (LIFO) principle. This means that the last element added to the stack is the first one to be removed.
Basic operations on a stack:
- Push: Add an element to the top of the stack.
- Pop: Remove the element from the top of the stack.
- Peek/Top: Get the value of the top element without removing it.
- isEmpty: Check if the stack is empty.
A stack can be represented using arrays, linked lists, or dynamic data structures in various programming languages. The most common methods involve using arrays or linked lists.
A queue is a linear data structure that follows the First In, First Out (FIFO) principle. This means that the first element added to the queue will be the first one to be removed. Queues are commonly used in scenarios where order needs to be preserved, such as task scheduling, handling requests in a web server, and breadth-first search in graphs.
Basic operations on a queue:
- Enqueue: Add an element to the end of the queue.
- Dequeue: Remove the element from the front of the queue.
- Front/Peek: Get the value of the front element without removing it.
- isEmpty: Check if the queue is empty.
- isFull: Check if the queue is full (only applicable to array-based queues).
- Size: Get the current size of the queue.
A queue can be represented using arrays, linked lists, or dynamic data structures in various programming languages. The most common methods involve using arrays or linked lists.
A linked list is a linear data structure where each element, called a node, contains two parts:
- Data: The value stored in the node.
- Pointer: A reference to the next node in the sequence.
Unlike arrays, linked lists do not store elements in contiguous memory locations. This allows for efficient insertions and deletions.
Linked lists are useful because:
- They allow for dynamic memory allocation.
- They provide efficient insertions and deletions compared to arrays.
- They can easily grow in size without requiring a large block of memory.
Types of linked lists:
- Singly Linked List: Each node points to the next node.
- Doubly Linked List: Each node points to both the next and the previous node.
- Circular Linked List: The last node points to the first node, forming a circle.
A singly linked list is a type of linked list in which each node contains two parts:
- Data: The value stored in the node.
- Pointer: A reference to the next node in the sequence.
In a singly linked list, each node points to the next node in the list, and the last node points to `nullptr`, indicating the end of the list.
Singly linked lists are useful because:
- They allow for dynamic memory allocation.
- They provide efficient insertions and deletions compared to arrays.
- They can easily grow in size without requiring a large block of contiguous memory.
Basic operations on a singly linked list:
- Insertion: Adding a new node to the list.
- Deletion: Removing a node from the list.
- Traversal: Visiting each node in the list.
- Search: Finding a node with a given value.
Searching algorithms are designed to retrieve information stored within some data structure. These algorithms are fundamental to computer science and are used in various applications, from databases to machine learning.
Linear search is the simplest search algorithm. It checks each element in the list until it finds the target element or reaches the end of the list.
Time complexity:
- Best case: O(1) (element is at the first position)
- Average case: O(n)
- Worst case: O(n) (element is not in the list or at the last position)

Space complexity:
- O(1) (iterative)
Binary search is a more efficient search algorithm that works on sorted arrays. It repeatedly divides the search interval in half and compares the target value to the middle element.
Time complexity:
- Best case: O(1) (element is at the middle)
- Average case: O(log n)
- Worst case: O(log n) (element is not in the list)

Space complexity:
- O(1) (iterative)
- O(log n) (recursive, due to the call stack)
Sorting is a fundamental operation in computer science, used to arrange data in a specific order. Various sorting algorithms are designed to efficiently sort data in different scenarios. This section covers Bubble Sort, Selection Sort, Insertion Sort, Quick Sort, and Merge Sort.
Bubble Sort repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. This process is repeated until the list is sorted.
- Compare each pair of adjacent elements from the beginning of the list.
- If a pair is in the wrong order, swap them.
- Repeat the process for each element, excluding the last one each time, since the largest element will have "bubbled" to the end.
```
function bubbleSort(array):
    n = length(array)
    for i from 0 to n-2:
        swapped = false
        for j from 0 to n-i-2:
            if array[j] > array[j+1]:
                swap(array[j], array[j+1])
                swapped = true
        if not swapped:
            break   // no swaps means the array is already sorted
```
Time complexity:
- Best case: O(n) (when the array is already sorted)
- Average case: O(n^2)
- Worst case: O(n^2)

Space complexity:
- O(1)
Selection Sort divides the list into a sorted and an unsorted part. It repeatedly selects the smallest (or largest) element from the unsorted part and moves it to the end of the sorted part.
- Find the minimum element in the unsorted part.
- Swap it with the first element of the unsorted part.
- Move the boundary of the sorted part one element forward.
- Repeat the process until the entire array is sorted.
```
function selectionSort(array):
    n = length(array)
    for i from 0 to n-1:
        minIndex = i
        for j from i+1 to n-1:
            if array[j] < array[minIndex]:
                minIndex = j
        swap(array[minIndex], array[i])
```
Time complexity:
- Best case: O(n^2)
- Average case: O(n^2)
- Worst case: O(n^2)

Space complexity:
- O(1)
Insertion Sort builds the sorted array one item at a time. It takes each element from the input and finds its correct position in the sorted part.
- Start with the second element, as the first element is considered sorted.
- Compare the current element with the elements in the sorted part and shift the sorted elements to the right to create the correct position for the current element.
- Insert the current element in its correct position.
- Repeat the process for all elements.
```
function insertionSort(array):
    n = length(array)
    for i from 1 to n-1:
        key = array[i]
        j = i-1
        while j >= 0 and array[j] > key:
            array[j+1] = array[j]
            j = j-1
        array[j+1] = key
```
Time complexity:
- Best case: O(n)
- Average case: O(n^2)
- Worst case: O(n^2)

Space complexity:
- O(1)
Quick Sort is a divide-and-conquer algorithm. It works by selecting a 'pivot' element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot.
- Choose a pivot element.
- Partition the array into two parts: elements less than the pivot and elements greater than the pivot.
- Recursively apply the above steps to the sub-arrays.
- Combine the sub-arrays and the pivot into a single sorted array.
```
function quickSort(array, low, high):
    if low < high:
        pivotIndex = partition(array, low, high)
        quickSort(array, low, pivotIndex - 1)
        quickSort(array, pivotIndex + 1, high)

function partition(array, low, high):
    pivot = array[high]
    i = low - 1
    for j from low to high - 1:
        if array[j] < pivot:
            i = i + 1
            swap(array[i], array[j])
    swap(array[i + 1], array[high])
    return i + 1
```
Time complexity:
- Best case: O(n log n)
- Average case: O(n log n)
- Worst case: O(n^2) (when the pivot is always the smallest or largest element, e.g. on an already sorted array)

Space complexity:
- O(log n) on average for the recursion stack, O(n) in the worst case
Merge Sort is a divide-and-conquer algorithm. It works by recursively dividing the array into two halves, sorting each half, and then merging them back together.
- Divide the array into two halves.
- Recursively sort each half.
- Merge the sorted halves back together.
```
function mergeSort(array):
    if length of array <= 1:
        return array
    mid = length of array / 2
    leftHalf = mergeSort(first half of array)
    rightHalf = mergeSort(second half of array)
    return merge(leftHalf, rightHalf)

function merge(leftHalf, rightHalf):
    result = empty array
    while leftHalf is not empty and rightHalf is not empty:
        if first element of leftHalf <= first element of rightHalf:
            append first element of leftHalf to result
            remove first element from leftHalf
        else:
            append first element of rightHalf to result
            remove first element from rightHalf
    append remaining elements of leftHalf to result
    append remaining elements of rightHalf to result
    return result
```
Time complexity:
- Best case: O(n log n)
- Average case: O(n log n)
- Worst case: O(n log n)

Space complexity:
- O(n) (auxiliary array used during merging)
Having covered the basics of linear data structures, let's delve into non-linear structures, starting with the Tree.
The Tree data structure resembles an inverted tree from nature, featuring a root and leaves. The root is the initial node, and the leaves are at the bottom-most level. Notably, there's only one path between any two nodes in a tree.
Based on the maximum number of children a node can have:
- Binary Tree: Each node can have a maximum of 2 children.
- Ternary Tree: Each node can have a maximum of 3 children.
- N-ary Tree: A node can have at most N children.
Additional classifications based on node configuration include:
- Complete Binary Tree: All levels are filled, except possibly for the last level, which is filled from the left as much as possible.
- Perfect Binary Tree: All levels are completely filled, so every leaf is at the same depth.
- Binary Search Tree: A special binary tree in which every node's left subtree holds smaller values and its right subtree holds larger values.
- Ternary Search Tree: Similar to a binary search tree, but with nodes having at most 3 children.
Resources: Trees
◌ Tree Data Structure | geeksforgeeks.org
◌ Heaps (priority queue) | viterbi-web.usc.edu
◌ Heaps | visualgo.net
◌ Priority Queues: Lecture Notes | cs.cmu.edu
◌ UNION-FIND DISJOINT SETS (UFDS) | visualgo.net
◌ DISJOINT-SET DATA STRUCTURES | topcoder.com
◌ Disjoint set (Union-Find): Lecture Notes | harvard.edu
◌ Segment Trees: MIN SEGMENT TREE | visualgo.net
◌ RANGE MINIMUM QUERY AND LOWEST COMMON ANCESTOR | topcoder.com
◌ Segment Trees | iarcs.org.in
◌ BINARY INDEXED TREES: TopCoder | topcoder.com
◌ Binary Index Tree (Fenwick tree) | visualgo.net
◌ Binary Index Tree: ICO | iarcs.org.in
◌ Trees (traversals) | berkeley.edu
◌ Dynamic programming on trees | iarcs.org.in
Practice Problems: Trees
Moving on to another crucial non-linear structure, let's explore the Graph. Unlike the Tree, a Graph lacks a specific root or leaf node and allows traversal in any order.
A Graph is a non-linear structure composed of a finite set of vertices (or nodes) and a set of edges connecting pairs of nodes. It proves invaluable in solving various real-life problems. Graphs can take different forms based on edge orientation and node characteristics.
Key concepts to explore:
- Types of Graphs: Varying types based on connectivity or weights of nodes.
- Introduction to BFS and DFS: Algorithms for traversing through a graph.
- Cycles in a Graph: Series of connections leading to a loop.
- Topological Sorting in the Graph
- Minimum Spanning Tree in Graph
Resources: Graphs
◌ Graph Data Structure And Algorithms | geeksforgeeks.org
◌ Depth First Search or DFS for a Graph | geeksforgeeks.org
◌ GRAPH TRAVERSAL (DFS/BFS) | visualgo.net
◌ Dijkstra’s shortest path algorithm | geeksforgeeks.org
◌ SINGLE-SOURCE SHORTEST PATHS | visualgo.net
◌ Bellman Ford Algorithm | geeksforgeeks.org
◌ One Source Shortest Path | compprog.wordpress.com
◌ Minimum spanning tree | cs.princeton.edu
◌ Articulation points | iarcs.org.in
◌ Strongly connected components | iarcs.org.in
◌ Topological Sorting | geeksforgeeks.org
◌ Euler Paths and Euler Circuits | jlmartin.ku.edu
◌ Fast Modulo Multiplication | codechef.com
◌ Algos for Calculating nCr % M | codechef.com
Practice Problems: Graphs
◌ Two Closest | codechef.com: PAIRCLST
◌ Special Shortest Walk | codechef.com: SPSHORT
◌ Robot Control | codeforces.com: 346/D
◌ Arbitrage | spoj.com: ARBITRAG
◌ Cost | spoj.com: HIGHWAYS
◌ Police Query | spoj.com: POLQUERY
◌ Visiting Friends | codechef.com: MCO16405
◌ Chef and Roads | codechef.com: CL16BF
◌ Codechef Password Recovery | codechef.com: CHEFPASS
◌ Tanya and Password | codeforces.com: contest/508/problem/D
◌ One-Way Reform | codeforces.com: contest/723/problem/E
◌ Problem Statement for NetworkSecurity | topcoder.com
◌ Leetcode: Interview Practice | Leetcode: Practice Graphs
Recursion stands out as a vital algorithm leveraging the concept of code reusability and repeated code usage. Its significance extends to being the foundation for many other algorithms, including:
- Tree Traversals
- Graph Traversals
- Divide and Conquer Algorithms
- Backtracking Algorithms
To explore Recursion thoroughly, refer to the following articles/links:
Resources: Recursion
◌ AN INTRODUCTION TO RECURSION PART ONE | topcoder.com
◌ AN INTRODUCTION TO RECURSION PART TWO | topcoder.com
◌ Introduction to Recursion | geeksforgeeks.org
◌ Backtracking, Memoization & Dynamic Programming! | loveforprogramming.quora.com
◌ Recursion Interview Questions & Tips | interviewing.io
Practice Problems: Recursion
◌ Connecting Soldiers | codechef.com: NOKIA
◌ Fit Squares in Triangle | codechef.com: TRISQ
◌ Leetcode: Interview Practice | Leetcode: Practice Recursion
Derived from Recursion, the Backtracking algorithm allows for retracing if a recursive solution fails, exploring alternative solutions. It systematically tries out all possible solutions to find the correct one.
Backtracking is an algorithmic technique that incrementally builds a solution, removing failed solutions that don't meet problem constraints.
Key problems to tackle in Backtracking algorithms:
- Knight’s Tour Problem
- Rat in a Maze
- N-Queen Problem
- Subset Sum Problem
- M-Coloring Problem
- Hamiltonian Cycle
- Sudoku
Resources: Backtracking
◌ Backtracking Algorithms | geeksforgeeks.org
◌ Recursion and Backtracking | codeforces.com
◌ Backtracking: the essential part of dynamic programming | codeforces.com
◌ Backtracking, Memoization & Dynamic Programming! | loveforprogramming.quora.com
◌ Backtracking Archives | geeksforgeeks.org
Practice Problems: Backtracking
◌ Leetcode: Interview Practice | Leetcode: Practice Backtracking
Dynamic Programming stands as a crucial algorithm, serving as an optimization over plain recursion. It becomes particularly valuable when a recursive solution involves repeated calls for the same inputs, allowing for optimization.
> *Those who cannot remember the past are condemned to repeat it.* - Dynamic Programming
Key concepts to explore in Dynamic Programming:
- Tabulation vs Memoization
- Optimal Substructure Property
- Overlapping Subproblems Property
- Bitmasking and Dynamic Programming
- Digit DP
Resources: Basic Dynamic Programming
◌ Demystifying Dynamic Programming | freecodecamp.org
◌ DP Tutorial and Problem List | codeforces.com
◌ DYNAMIC PROGRAMMING: FROM NOVICE TO ADVANCED | topcoder.com
◌ Dynamic Programming | geeksforgeeks.org
◌ Backtracking, Memoization & Dynamic Programming! | loveforprogramming.quora.com
Practice Problems: Basic Dynamic Programming
◌ Alternating subarray prefix | codechef.com: ALTARAY
◌ Subtraction Game 2 | codechef.com: AMSGAME2
◌ Striver DP Series | takeuforward.org
◌ Leetcode: Interview Practice | Leetcode: Practice Dynamic Programming