A Data Structure is a way of organizing and storing data in a computer so that it can be accessed and modified efficiently.
| Linear Data Structure | Non-linear Data Structure |
|---|---|
| Elements arranged sequentially | Elements arranged hierarchically |
| Single traversal path | Multiple traversal paths |
| Example: Array, Stack | Example: Tree, Graph |
An Abstract Data Type (ADT) is a data type that defines data and operations on it without specifying the implementation details.
Asymptotic notation is used to describe the performance or complexity of an algorithm as input size increases (e.g., Big-O, Omega, Theta).
Time complexity is the measure of how much time an algorithm takes to execute as a function of input size.
Recursion is a technique where a function calls itself to solve smaller instances of a problem.
| Linear Search | Binary Search |
|---|---|
| Sequential comparison | Divide and conquer |
| Works on unsorted data | Requires sorted data |
| Time: O(n) | Time: O(log n) |
- Step 1: Start from first element
- Step 2: Compare key with each element
- Step 3: If match found → return position
- Step 4: If not found → return -1
Dynamic memory allocation is the process of allocating memory at runtime using functions like malloc(), calloc(), etc.
Applications of arrays:
- Storing multiple values of the same type
- Representing matrices and tables
Asymptotic notation is used to analyze the performance of algorithms in terms of input size (n).
🔹 BIG-O NOTATION (O)
Represents upper bound (worst case)
f(n) = O(g(n)) if f(n) ≤ c*g(n) for all n ≥ n₀, for some constants c > 0 and n₀ ≥ 1
Example
f(n) = 3n² + 5n + 2 ≤ 10n² for all n ≥ 1, so f(n) = O(n²) with c = 10, n₀ = 1
🔹 OMEGA NOTATION (Ω)
Represents lower bound (best case): f(n) = Ω(g(n)) if f(n) ≥ c*g(n) for all n ≥ n₀
Example
Linear Search → Ω(1)
🔹 THETA NOTATION (Θ)
Represents tight bound: f(n) = Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n))
Example
f(n) = 2n + 3 → Θ(n)
| Notation | Meaning | Case |
|---|---|---|
| O | Upper bound | Worst |
| Ω | Lower bound | Best |
| Θ | Tight bound | Average |
Algorithm analysis determines efficiency in terms of:
- Time Complexity
- Space Complexity
Example
for(i=1; i<=n; i++)
{
print(i);
}
Time = O(n)
for(i=1; i<=n; i++)
{
for(j=1; j<=n; j++)
{
print(i,j);
}
}
Time = O(n²)
Recursion is a technique where a function calls itself.
Factorial(n)
{
if(n == 0)
{
return 1;
}
else
{
return n * Factorial(n-1);
}
}
Traversal Example:
for(i=0; i<n; i++)
{
print(a[i]);
}
Insertion Example:
[10,20,30] → [10,15,20,30]
Deletion Example:
[10,20,30] → [10,30]
Linear Search Example:
for(i=0; i<n; i++)
{
if(a[i] == key)
{
return i;
}
}
Binary Search Example:
while(low <= high)
{
mid = (low + high)/2;
if(a[mid] == key)
{
return mid;
}
else if(key < a[mid])
{
high = mid - 1;
}
else
{
low = mid + 1;
}
}
Bubble Sort Example:
for(i=0; i<n-1; i++)
{
for(j=0; j<n-i-1; j++)
{
if(a[j] > a[j+1])
{
swap(a[j], a[j+1]);
}
}
}
Insertion Sort Example:
for(i=1; i<n; i++)
{
key = a[i];
j = i - 1;
while(j >= 0 && a[j] > key)
{
a[j+1] = a[j];
j--;
}
a[j+1] = key;
}
| Algorithm | Best | Average | Worst |
|---|---|---|---|
| Bubble | O(n) | O(n²) | O(n²) |
| Selection | O(n²) | O(n²) | O(n²) |
| Insertion | O(n) | O(n²) | O(n²) |
| Merge | O(n log n) | O(n log n) | O(n log n) |
| Quick | O(n log n) | O(n log n) | O(n²) |
| Radix | O(nk) | O(nk) | O(nk) |