Information Age


The Information Age, also called the Computer Age, Digital Age, or New Media Age, is the historical period that began in the mid-20th century, characterized by a rapid epochal shift from the traditional industry established by the Industrial Revolution to an economy primarily based upon information technology. The onset of the Information Age has been associated with the development of the transistor in 1947 and the optical amplifier in 1957, the respective bases of computing and fiber-optic communications.

According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on advances in computer microminiaturization, which led to modernized information and communication processes that, upon broader adoption within society, became the driving force of social evolution.

Overview of early developments


In 1945, Fremont Rider calculated that library expansion would double in capacity every 16 years where sufficient space was made available. He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons and other institutions.

What Rider did not foresee, however, was Moore's law: formulated around 1965, it observed that the number of transistors in a dense integrated circuit doubles about every two years.
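To make the doubling rule concrete, the following is a minimal sketch of the exponential growth Moore's law describes. The 1971 baseline of roughly 2,300 transistors (the Intel 4004) and the strict two-year period are illustrative assumptions, not figures taken from the sources above.

```python
# Illustrative sketch of Moore's law: transistor counts doubling about every two years.
# The 1971 Intel 4004 baseline (~2,300 transistors) and the fixed 2-year period are
# simplifying assumptions used only for illustration.

def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistor count under an idealized Moore's-law doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):.3g}")
```

Rider's library estimate follows the same form with a 16-year doubling period (doubling_years=16), which is what makes the two growth rates directly comparable.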

By the early 1980s, along with improvements in computing power, the proliferation of smaller and less expensive personal computers allowed for instant access to information and the ability to share and store it. Connectivity between computers within organizations enabled access to greater amounts of information.

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes (EB) in 1986 to 15.8 EB in 1993; over 54.5 EB in 2000; and to 295 (optimally compressed) EB in 2007. This is the informational equivalent of less than one 730-megabyte (MB) CD-ROM per person in 1986 (539 MB per person); roughly four CD-ROMs per person in 1993; twelve CD-ROMs per person in 2000; and nearly sixty-one CD-ROMs per person in 2007. It is estimated that the world's capacity to store information reached 5 zettabytes in 2014, the informational equivalent of 4,500 stacks of printed books from the Earth to the Sun.
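The per-person CD-ROM figures follow from dividing each year's storage total by the world population; the sketch below reproduces that arithmetic, with round population estimates assumed for illustration (they are not values from the cited study).

```python
# Rough reconstruction of the per-person CD-ROM equivalents quoted above.
# World population figures are approximate round numbers assumed for illustration.

CD_ROM_BYTES = 730e6  # one 730 MB CD-ROM

storage_bytes = {     # optimally compressed storage, from the figures in the text
    1986: 2.6e18,
    1993: 15.8e18,
    2000: 54.5e18,
    2007: 295e18,
}
population = {        # assumed approximate world population
    1986: 4.9e9,
    1993: 5.5e9,
    2000: 6.1e9,
    2007: 6.6e9,
}

for year, total in storage_bytes.items():
    per_person = total / population[year]
    print(year, f"{per_person / 1e6:.0f} MB per person,",
          f"about {per_person / CD_ROM_BYTES:.1f} CD-ROMs")
```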

As Moore's law would suggest, the amount of storage space available appears to be growing approximately exponentially.

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of optimally compressed information in 1986; 715 optimally compressed exabytes in 1993; 1.2 optimally compressed zettabytes in 2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day.

The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of optimally compressed information in 1986; 471 petabytes in 1993; 2.2 optimally compressed exabytes in 2000; and 65 optimally compressed exabytes in 2007, the information equivalent of 6 newspapers per person per day. In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3,000 in 1997 would cost $2,000 two years later and $1,000 the following year.
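The "newspapers per person per day" comparisons can be checked in the same way. The sketch below works backwards from the 2007 figures to the implied size of one optimally compressed newspaper; the 6.6 billion world population is an assumed approximation, and the per-newspaper size is a derived figure rather than one stated in the text.

```python
# Back out the implied "newspaper" size from the 2007 figures quoted above.
# The 6.6 billion world population is an assumed approximation.

POPULATION_2007 = 6.6e9
DAYS_PER_YEAR = 365

figures = (
    # label, optimally compressed bytes in 2007, quoted newspapers per person per day
    ("one-way broadcast", 1.9e21, 174),
    ("two-way telecom", 65e18, 6),
)

for label, total_bytes, newspapers in figures:
    per_person_day = total_bytes / POPULATION_2007 / DAYS_PER_YEAR
    print(f"{label}: {per_person_day / 1e6:.0f} MB per person per day,"
          f" about {per_person_day / newspapers / 1e6:.1f} MB per newspaper")
```

Both channels come out at roughly 4.5 MB per newspaper, so the two comparisons are internally consistent.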

The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993; to 2.9 × 10^11 MIPS in 2000; to 6.4 × 10^12 MIPS in 2007. An article in the journal Trends in Ecology and Evolution in 2016 reported that:

Digital technology has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5x10^21 bytes per 7.2x10^9 people).
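The per capita claim in the quoted passage is simple arithmetic, and the quick check below uses only the numbers given in the quote: dividing global digital storage by world population gives roughly 7 × 10^11 bytes per person, the same order of magnitude as the ~10^12 bytes estimated for a single brain.

```python
# Check of the per-capita storage comparison quoted above, using only the quoted numbers.
digital_storage_bytes = 5e21    # global digital storage
world_population = 7.2e9
brain_storage_bytes = 1e12      # estimated storage capacity of one human brain

per_capita = digital_storage_bytes / world_population
print(f"{per_capita:.1e} bytes of digital storage per person")                      # ~6.9e+11
print(f"ratio to one brain's ~1e12 bytes: {per_capita / brain_storage_bytes:.2f}")  # ~0.69
```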

Genetic information may also be considered part of the information revolution. Now that sequencing has been computerized, the genome can be rendered and manipulated as data. This started with DNA sequencing, invented by Walter Gilbert and Allan Maxam in 1976-1977 and by Frederick Sanger in 1977, grew steadily with the Human Genome Project, initially conceived by Gilbert, and finally reached practical applications of sequencing, such as gene testing, after the discovery by Myriad Genetics of the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from 606 genome sequences registered in December 1982 to 231 million genomes in August 2021. An additional 13 trillion incomplete sequences are registered in the Whole Genome Shotgun submission database as of August 2021. The information contained in these registered sequences has doubled every 18 months.