What Is A Zettabyte and How Is It Managed? [Infographic]

Cisco, the worldwide leader in networking, has predicted as part of its Visual Networking Index initiative that annual global Internet Protocol (IP) traffic will pass the zettabyte level sometime during 2016.

What is a Zettabyte?

A zettabyte is about one sextillion (10^21) bytes, equivalent to roughly 250 billion DVDs. It is also the scale at which storage across the globe's servers is now measured. If those DVDs were lined up in a single file, they would span our galaxy, the Milky Way, which is nearly 100,000 light-years across. The figure matters because zettabyte-scale data has to become manageable before it becomes useful: quick access to information across the Internet means planning how to handle, safeguard, and transport data in seconds.
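The back-of-the-envelope arithmetic behind the "250 billion DVDs" figure can be checked in a few lines. This sketch assumes the decimal definition of a zettabyte (10^21 bytes) and a 4.7 GB single-layer DVD; both assumptions are mine, not stated in the article, and they land in the same ballpark as the quoted figure.

```python
# Rough arithmetic behind the "250 billion DVDs" comparison.
# Assumptions (not from the article): decimal zettabyte, single-layer DVD.
ZETTABYTE = 10**21        # bytes, decimal definition
DVD_BYTES = 4.7 * 10**9   # single-layer DVD capacity in bytes

dvds = ZETTABYTE / DVD_BYTES
print(f"{dvds / 1e9:.0f} billion DVDs")  # ≈ 213 billion
```

Using dual-layer discs or binary prefixes shifts the result, which is why published comparisons vary between roughly 200 and 250 billion.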

Data Management

Data management has six core areas: integrity of the data, availability, security, server capacity, server performance, and scalability. Each area is related to all the others, so optimum zettabyte-scale data movement depends on an ebb and flow that keeps them in balance as information is transported.

Integrity of the Data

How data is retrieved and then stored matters. Data must be credible and free of corruption; it is a matter of trust across thousands of servers and millions of computers. Events out in cyberspace happen both intentionally and unintentionally. Mechanisms that correct errors, guarantee retrieval, and back up systems make sitting down at a computer, at work or at home, efficient and even enjoyable. Established policies, procedures, and contingency plans for interruptions keep information flowing smoothly.
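One of the simplest integrity mechanisms alluded to above is comparing a cryptographic checksum recorded at storage time against the data that comes back, so any corruption is detected. The function names here are illustrative, not taken from any particular storage system.

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return a SHA-256 digest used as an integrity fingerprint."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected: str) -> bool:
    """True if the data still matches the checksum recorded when stored."""
    return checksum(data) == expected

original = b"quarterly sales figures"
stored_digest = checksum(original)

print(verify(original, stored_digest))                   # True: data intact
print(verify(b"quarterly sales figure$", stored_digest)) # False: corrupted
```

Real storage systems layer error-correcting codes and replication on top of this idea, but the principle is the same: detect the mismatch before serving the data.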


Availability

A data center must track every bit and byte entering its servers. The storage architecture must also be arranged in an orderly way so that requested data can be retrieved in seconds and presented in a usable form. Information that is unavailable because of a problem with the servers or the network itself can cost millions of dollars per hour. Financial centers, telecommunications industries, and e-commerce depend on data being available the moment they request it.


Security

Data centers are housed throughout the world, and each must determine how to protect its physical premises from unwarranted access. Policies protecting core server areas are a must, and securing remote access from homes or smartphones helps preserve data integrity. This area also includes environmental monitoring, such as an ITWatchDogs PoE monitor that watches humidity and temperature so adjustments can be made to maintain optimum conditions for the equipment. Disaster response plans for data retrieval let business carry on as usual within minutes of an emergency. Off-site options such as mirror sites, cloud storage, or web services keep data secure.
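The environmental monitoring described above boils down to checking sensor readings against safe operating ranges and flagging anything out of bounds. This toy sketch is my own illustration; the thresholds and reading format are hypothetical and not taken from any ITWatchDogs product.

```python
# Hypothetical safe ranges for a server room; values are assumptions,
# not vendor specifications.
SAFE_RANGES = {
    "temperature_c": (18.0, 27.0),
    "humidity_pct": (40.0, 60.0),
}

def out_of_range(readings: dict) -> list:
    """Return (sensor, value) pairs that fall outside their safe range."""
    alerts = []
    for sensor, value in readings.items():
        low, high = SAFE_RANGES[sensor]
        if not (low <= value <= high):
            alerts.append((sensor, value))
    return alerts

print(out_of_range({"temperature_c": 31.5, "humidity_pct": 45.0}))
# [('temperature_c', 31.5)]
```

A production monitor would poll sensors continuously and raise alerts by email or SNMP trap, but the threshold check at its core looks much like this.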


Performance

Regular maintenance and cleaning help catch issues early, while they are still immediately solvable, and make it possible to trace a problem to its true root cause. The longer an issue goes unresolved, the more it becomes compounded and clouded by other issues, letting the root cause off the hook to corrupt or attack other servers. Analyzing the usefulness of connections to the public Internet also helps in making good choices for a business or its needs.


Scalability

Scalability rests on the expectation that business use, personal use, and the Internet itself will continue to grow. Room space, present equipment, and storage capacity all serve as indicators of when and where to expand. The numerous web services appearing that manage data are also worth considering.



Balance

All the elements that make data transfer work efficiently have to be kept in balance. Regularly tweaking procedures and equipment and automating basic, repetitive tasks go a long way toward easy manageability. The human element is just as important: computers and servers are only as good as the input they receive, and it is human judgment in prioritizing needs that matters most, especially in crisis situations.

In the next three to four years the zettabyte will arrive. New ways of thinking, new ways of storing, and new ways of managing must be developed. After all, the universe of cyberspace is approaching the scale of our galaxy, metaphorically speaking of course.

Take a look at this infographic, which explains more about the zettabyte and how it will affect us in the future.


Milky Way fact and definition of zettabyte courtesy of NASA

Information given and verified by Jake Page, electrical engineer and analyst

This is a guest post from Derek Newman, who writes for ITWatchDogs, which offers environmental and temperature monitoring tools that track power, humidity, light, airflow, and much more. Install them in your server room, data center, cold storage, research lab, or other mission-critical facility to prevent equipment failure or downtime.