Quote:
I'm new to
computer science and I can't understand the concept of sorting 100 GB in 1 GB and I would like your help with solving this specific problem.
Probably because there is no such concept.
HDD storage and memory are essentially the same kind of thing; the differences are that memory is much smaller and much faster.
Because memory is the processor's natural workspace and a file is not, the means of reading and writing differ in code, but they are similar in principle.
The main practical difference is that sequential reads/writes on a file are far more efficient than random ones.
Quote:
In sorting mergesort is the fastest sorting method with O(nlogn) complexity and b+ trees are good for sorting big amounts of data so I am thinking it has to be one of these 2 methods .
There are many more sorting methods, and each has its advantages.
Note that a sorting method is not tied to the size of the data; some are simply more efficient than others in particular situations.
So merge sort is a good option. Remember that you compare only 2 values at a time.
You can use a hybrid method: merge sort small chunks (less than half the memory size) in memory, then merge the resulting larger chunks on file.
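That hybrid approach is usually called an external merge sort. Here is a minimal sketch in Python, assuming (for illustration only) that the input is a text file with one integer per line; the function name, file paths, and chunk size are all hypothetical, and in a real 100 GB case you would pick the chunk size from your available memory.

```python
import heapq
import os
import tempfile

def sorted_run(path):
    """Yield integers from one sorted temporary run file."""
    with open(path) as f:
        for line in f:
            yield int(line)

def external_sort(input_path, output_path, chunk_size=100_000):
    """Sort a large file of integers (one per line) using limited memory."""
    temp_paths = []
    # Phase 1: read memory-sized chunks, sort each in memory,
    # and spill each one to disk as a sorted "run".
    with open(input_path) as src:
        while True:
            chunk = [int(line) for _, line in zip(range(chunk_size), src)]
            if not chunk:
                break
            chunk.sort()
            fd, path = tempfile.mkstemp(text=True)
            with os.fdopen(fd, "w") as tmp:
                tmp.write("\n".join(map(str, chunk)) + "\n")
            temp_paths.append(path)
    # Phase 2: k-way merge of the sorted runs. heapq.merge compares only
    # the current head of each run (2 values at a time internally), so
    # memory use grows with the number of runs, not the data size,
    # and all file I/O is sequential.
    with open(output_path, "w") as out:
        for value in heapq.merge(*(sorted_run(p) for p in temp_paths)):
            out.write(f"{value}\n")
    for path in temp_paths:
        os.remove(path)
```

With a 1 GB memory budget you would sort roughly memory-sized chunks in phase 1, producing on the order of a hundred runs for 100 GB of input, and phase 2 reads each run sequentially exactly once.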
Advice: study sorting algorithms
Sorting algorithm - Wikipedia[^]
Sorting Algorithms - GeeksforGeeks[^]