How much data can we pack into a given space?
Thanks to its high information density, all of our data and knowledge would fit into a spoonful of DNA. Storing the same amount of data in a data center would require many racks full of disk arrays and would consume far more energy than the DNA.
Is it possible to store information 619 octillion times more densely than it is now?
Storing data in DNA as a starting point
DNA can store information and even replicate itself, and it can remain readable even after millions of years.
Because of this extremely high information density, it would be possible to store all of our data and knowledge in DNA with the volume of a single spoonful of liquid (http://www.ibtimes.co.uk/dna-data-storage-1-8-zettabytes-information-single-spoon-liquid-1488529).
When you enter a data center, you see a large number of racks housing hundreds or even thousands of disk arrays, generating a significant amount of heat and noise. Yet even such a data center holds only a negligible fraction of the roughly 1.8 zettabytes we have accumulated so far.
Our data currently occupy square kilometers of floor space, a footprint that keeps growing and consumes a significant amount of energy. Could all of it really be stored in a spoonful of DNA?
Comparison & Calculation
Data are stored on hard disks and, increasingly, on solid-state disks (SSDs, essentially flash memory). A 4 TB SSD in the 2.5-inch form factor has a volume of about 316,920 mm^3.
That comes out to a rough figure of 0.00026 TB per mm^3, or 260,000 bytes and 2,080,000 bits per mm^3.
That translates to roughly 59 bits per mm, which means a single bit occupies on the order of 0.01 mm. A tablespoon holds 15 milliliters (see http://en.wikipedia.org/wiki/Litre#SI_prefixes_applied_to_the_litre), which is equivalent to 15,000 mm^3.
If 1.8 zettabytes can be stored in a spoon, then that spoon holds 1.44 × 10^22 bits, or about 9.6 × 10^17 bits per mm^3, which works out to 986,484 bits per mm (the cube root of the volumetric density). That puts the size of a bit in DNA at about 0.000001 mm, meaning a DNA bit is roughly 10,000 times smaller than a bit on a contemporary enterprise SSD.
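This back-of-the-envelope calculation is easy to reproduce. Below is a minimal Python sketch of the same arithmetic, using the figures from the text (1.8 ZB in a 15,000 mm^3 tablespoon) and SI units (1 ZB = 10^21 bytes); the same formula applies to the SSD estimate above.

```python
# Back-of-the-envelope estimate for the "all our data in a spoonful of DNA" claim.
# Figures are taken from the text; SI units are assumed (1 ZB = 10^21 bytes).

data_bits = 1.8e21 * 8        # 1.8 zettabytes expressed in bits
spoon_volume_mm3 = 15_000     # a 15 ml tablespoon is 15,000 mm^3

bits_per_mm3 = data_bits / spoon_volume_mm3   # volumetric bit density
bits_per_mm = bits_per_mm3 ** (1 / 3)         # linear bit density (cube root)
bit_size_mm = 1 / bits_per_mm                 # edge length of one "bit cube"

print(f"{bits_per_mm3:.2e} bits per mm^3")    # ~9.6e17
print(f"{bits_per_mm:,.0f} bits per mm")      # ~986,484
print(f"{bit_size_mm:.1e} mm per bit")        # ~1.0e-06
```

Running it reproduces the ~986,484 bits per mm and the ~10^-6 mm bit size used above.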
Minimal Size
What could possibly be the smallest size in our universe? After doing some research, I came across the Planck length.
A Planck length is 1.616 × 10^-32 millimeters (1.616 × 10^-35 meters). As far as we know, nothing in our universe is meaningfully smaller than the Planck length.
If one were to make the extremely speculative assumption that a bit could be stored at the scale of a single Planck length, the smallest possible size for a bit would be 1.616 × 10^-32 millimeters.
- Enterprise SSD (4 TB): 0.01 mm
- DNA: 0.000001 mm
- Planck length: 0.00000000000000000000000000000001616 mm (1.616 × 10^-32 mm)
Comparing the bit size of today's SSDs (0.01 mm) with the Planck length suggests that information could, in theory, be stored roughly 619 octillion times (about 6 × 10^29 times) more densely than it is today. It also means that the range of scales between an atom and the Planck length, where the so-called strings are thought to live, is an enormous space to explore.
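To put the three scales side by side, here is a small sketch that compares the bit sizes from the list above; the ratios are linear (per unit of length), not volumetric.

```python
# Approximate linear size of one bit at each scale, in millimeters
# (figures taken from the list above).
bit_sizes_mm = {
    "Enterprise SSD (4 TB)": 0.01,
    "DNA": 1e-6,
    "Planck length": 1.616e-32,
}

ssd_bit = bit_sizes_mm["Enterprise SSD (4 TB)"]
for name, size in bit_sizes_mm.items():
    factor = ssd_bit / size  # how many of these bits fit into the length of one SSD bit
    print(f"{name}: {size:.3e} mm (ratio to the SSD bit: {factor:.3g})")
```

The last line reproduces the factor of roughly 6 × 10^29 mentioned above.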
What does ChatGPT say about information density?
According to ChatGPT, the relevant physical limits are the Schwarzschild radius, the Bekenstein bound, and the speed of light.
This post is about the Planck length and information density. In any case, information density is limited by physical laws or, more specifically, by our knowledge of those laws.
Prompt: “What is the maximum possible information density, based on known physical knowledge?”
Answer: “The maximum possible information density that can be achieved based on known physical knowledge is ultimately determined by the fundamental nature of reality and the laws of physics that govern it. At the most basic level, information is encoded in the physical state or configuration of matter and energy, and the maximum possible information density of a given system is therefore limited by the maximum possible density of matter and energy in that system.
According to current physical theories, the maximum possible information density of a system is ultimately limited by the Schwarzschild radius of the system, which is the distance from the center of the system at which the escape velocity exceeds the speed of light. At this point, the system becomes a black hole, and information cannot be transmitted or accessed from the outside.
…One of the most well-known limits on the maximum possible information density is the so-called “Bekenstein bound,“ which states that the information contained within a region of space is limited by the surface area of the region, regardless of the volume of the region. This means that, in order to store more information, the surface area of the storage medium must be increased…
Other physical limits on the maximum possible information density include the speed of light, which limits the rate at which information can be transmitted, and the uncertainty principle in quantum mechanics, which places limits on the precision with which certain properties of particles can be measured.
In practice, however, it is difficult to approach the theoretical maximum possible information density in real-world systems due to various practical constraints and limitations. Nonetheless, advances in technology and our understanding of the physical world continue to push the boundaries of what is possible, and it is likely that we will continue to see increases in the information density of various systems over time. “
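To attach a concrete number to the Bekenstein bound mentioned in the answer, here is a rough sketch using the standard form of the bound, I ≤ 2πRE / (ħc ln 2). The example values (1 kg of matter inside a sphere of 10 cm radius) are illustrative and not taken from the article.

```python
import math

# Bekenstein bound: the maximum number of bits that can be stored in a sphere
# of radius R (meters) containing total energy E (joules):
#   I <= 2 * pi * R * E / (hbar * c * ln 2)
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s

def bekenstein_bits(radius_m: float, mass_kg: float) -> float:
    energy_j = mass_kg * C**2   # rest-mass energy, E = m c^2
    return 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# Illustrative example (not from the article): 1 kg of matter in a 10 cm sphere.
print(f"{bekenstein_bits(0.10, 1.0):.1e} bits")   # ~2.6e42 bits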
The amount of information available and the metaverse
Digitalization includes the rise of the Metaverse, which means that even more data will be produced. Knowing the capabilities and constraints of data storage from a financial point of view allows an educated guess about how far the Metaverse can expand.
Thought experiments can probe whether DNA or other theoretical storage options would allow for this. The obvious takeaway is that there is a tremendous amount of untapped potential to increase the information density of data storage.
Conclusions
Information density is bounded either by the physical laws themselves or, more importantly, by our comprehension of those laws.
The sheer volume of data has a significant influence on the economy. Our current technological world would not be possible without the capacity to store data in the most compact form possible. Twitter, Facebook, and Google, as well as large corporations, government agencies, and individual users, are all producing and storing ever more data. This is only possible because the cost of storage keeps falling year after year.
Because of this, the question “where are our limits?” makes a great deal of sense.
How much information can we store? Can we keep expanding our data indefinitely? How would that even be possible? And how quickly can we write and read that data?
The price of storage has fallen dramatically: from 289,000 USD per GB in 1981 (an IBM 19 MB HDD) to about 0.02 USD per GB for an HDD in 2015 (as a reference point). Solid-state drives (SSDs) currently cost more per gigabyte than hard disk drives (HDDs), but they are significantly faster. At those prices, storing a 19 MB file would have cost roughly 5,500 USD in 1981, versus about 0.00038 USD in 2015.
The price per gigabyte fell by a factor of about 14.45 million between 1981 and 2015. A hypothetical storage method based on DNA or proteins could push costs down even further.
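The cost arithmetic is easy to check. The sketch below uses the per-gigabyte prices quoted above and decimal units (1 GB = 1,000 MB).

```python
# Rough cost-per-gigabyte comparison, using the figures quoted in the text.
price_per_gb_1981 = 289_000.0   # USD per GB (IBM 19 MB HDD era)
price_per_gb_2015 = 0.02        # USD per GB (commodity HDD)

file_size_gb = 19 / 1000        # a 19 MB file, in decimal gigabytes

print(f"Price drop factor: {price_per_gb_1981 / price_per_gb_2015:,.0f}x")  # ~14,450,000x
print(f"19 MB file in 1981: ${file_size_gb * price_per_gb_1981:,.2f}")      # ~$5,491
print(f"19 MB file in 2015: ${file_size_gb * price_per_gb_2015:.5f}")       # ~$0.00038
```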
The conclusion is that 1.8 zettabytes is not actually that much data if information could, at least in theory, be stored at the Planck scale. Laid out end to end at one bit per Planck length, 1.8 zettabytes (1.44 × 10^22 bits) would span only about 2.3 × 10^-10 millimeters, far too small to be seen with the naked eye.
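And a one-line check of that final figure, assuming one bit per Planck length laid end to end:

```python
# 1.8 ZB written as a single line of bits, one bit per Planck length (illustrative).
planck_length_mm = 1.616e-32   # Planck length in millimeters
bits = 1.8e21 * 8              # 1.8 zettabytes in bits (1 ZB = 10^21 bytes)

print(f"{bits * planck_length_mm:.1e} mm")   # ~2.3e-10 mm, far below visibility
```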