
Thursday, April 8, 2010

History of Television


In its early stages of development, television employed a combination of optical, mechanical and electronic technologies to capture, transmit and display a visual image. By the late 1920s, however, systems employing only optical and electronic technologies were being explored. All modern television systems rely on the latter, although the knowledge gained from the work on mechanical-dependent systems was crucial in the development of fully electronic television.
[Image: American family watching TV, 1958]

Images were first transmitted electrically by early mechanical fax machines, including the pantelegraph, developed in the late 1800s. The concept of electrically powered transmission of television images in motion was first sketched in 1878 as the telephonoscope, shortly after the invention of the telephone. At the time, early science fiction authors imagined that light could someday be transmitted over wires, just as sounds were.

The idea of using scanning to transmit images was put to practical use in 1881 in the pantelegraph, through the use of a pendulum-based scanning mechanism. From this period forward, scanning in one form or another has been used in nearly every image-transmission technology to date, including television. This is the concept of "rasterization": the process of converting a visual image into a stream of electrical pulses.
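To make the idea of rasterization concrete, here is a minimal illustrative sketch in Python (the tiny 4x4 image and its brightness values are invented for this example): it scans a grayscale image one horizontal line at a time and emits the pixels as a single serial stream, which is roughly what a scanning disk did with light falling on a selenium sensor.

    # Illustrative sketch of rasterization: turn a 2D image into a 1D stream
    # by scanning it one horizontal line at a time (hypothetical 4x4 image).
    image = [
        [0, 40, 80, 120],
        [10, 50, 90, 130],
        [20, 60, 100, 140],
        [30, 70, 110, 150],
    ]

    def rasterize(img):
        """Yield pixel brightness values line by line, like a scanning disk."""
        for row in img:          # each row is one horizontal "slice" of the image
            for pixel in row:    # each pixel becomes one pulse in the stream
                yield pixel

    signal = list(rasterize(image))
    print(signal)  # one serial stream of values, ready to transmit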

In 1884 Paul Gottlieb Nipkow, a 23-year-old university student in Germany, patented the first electromechanical television system which employed a scanning disk, a spinning disk with a series of holes spiraling toward the center, for rasterization. The holes were spaced at equal angular intervals such that in a single rotation the disk would allow light to pass through each hole and onto a light-sensitive selenium sensor which produced the electrical pulses. As an image was focused on the rotating disk, each hole captured a horizontal "slice" of the whole image.

Nipkow's design would not be practical until advances in amplifier tube technology became available. The device was only useful for transmitting still "halftone" images — represented by equally spaced dots of varying size — over telegraph or telephone lines.[citation needed] Later designs would use a rotating mirror-drum scanner to capture the image and a cathode ray tube (CRT) as a display device, but moving images were still not possible, due to the poor sensitivity of the selenium sensors. In 1907 Russian scientist Boris Rosing became the first inventor to use a CRT in the receiver of an experimental television system. He used mirror-drum scanning to transmit simple geometric shapes to the CRT.[3]

Scottish inventor John Logie Baird demonstrated the transmission of moving silhouette images in London in 1925, and of moving, monochromatic images in 1926. Baird's scanning disk, with a double spiral of lenses, produced an image of 30 lines of resolution, just enough to discern a human face.[citation needed] Remarkably, in 1927 Baird also invented the world's first video recording system, "Phonovision": by modulating the output signal of his TV camera down to the audio range he was able to capture the signal on a 10-inch wax audio disc using conventional audio recording technology. A handful of Baird's Phonovision recordings survive, and these were finally decoded and rendered into viewable images in the 1990s using modern digital signal-processing technology.[4]

In 1926, Hungarian engineer Kálmán Tihanyi designed a television system utilizing fully electronic scanning and display elements, and employing the principle of "charge storage" within the scanning (or "camera") tube.[5][6][7][8]

By 1927, Russian inventor Léon Theremin had developed a mirror drum-based television system which used interlacing to achieve an image resolution of 100 lines.

Also in 1927, Herbert E. Ives of Bell Labs transmitted moving images from a 50-aperture disk producing 16 frames per second over a cable from Washington, DC to New York City, and via radio from Whippany, New Jersey. Ives used viewing screens as large as 24 by 30 inches (60 by 75 centimeters). His subjects included Secretary of Commerce Herbert Hoover.

In 1927, Philo Farnsworth made the world's first working television system with electronic scanning of both the pickup and display devices,[9] which he first demonstrated to the press on 1 September 1928.[9][10]

The first practical use of television was in Germany. Regular television broadcasts began in Germany in 1929, and in 1936 the Olympic Games in Berlin were broadcast to television stations in Berlin and Leipzig, where the public could view the games live.[11]

In 1936, Kálmán Tihanyi described the principle of plasma television, the first flat panel system.[12][13]

A Short Description of a Computer


A computer is a programmable machine that receives input, stores and manipulates data, and provides output in a useful format.

Although mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century (1940–1945). These were the size of a large room, consuming as much power as several hundred modern personal computers (PCs).[1] Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space.[2] Simple computers are small enough to fit into small pocket devices, and can be powered by a small battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous.

The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore computers ranging from a netbook to a supercomputer are all able to perform the same computational tasks, given enough time and storage capacity.
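As a rough illustration of what "storing and executing a list of instructions" means (the toy instruction set below is invented for this example and does not describe any real machine), here is a short Python sketch in which the program is held as ordinary data and executed step by step; changing the list changes the behaviour, which is what makes a computer more versatile than a fixed-function calculator.

    # Toy stored-program machine (hypothetical instruction set, for illustration only).
    # The program is plain data: a list of (operation, argument) pairs.
    program = [
        ("load", 2),     # put 2 into the accumulator
        ("add", 3),      # add 3
        ("mul", 4),      # multiply by 4
        ("print", None), # output the result
    ]

    def run(prog):
        accumulator = 0
        for op, arg in prog:
            if op == "load":
                accumulator = arg
            elif op == "add":
                accumulator += arg
            elif op == "mul":
                accumulator *= arg
            elif op == "print":
                print(accumulator)
        return accumulator

    run(program)  # prints 20; a different list of instructions performs a different task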

History of Sri Lanka


The history of Sri Lanka comprises the chronicle records and archaeological discoveries concerning the people and events of the island of Sri Lanka. A number of archaeological finds and chronicles, written by Sri Lankans and non-Sri Lankans alike, explore a history of more than 10,000 years.

The archaeological discovery of the Balangoda Man provides evidence of a civilization dating back some 30,000 years. The famous chronicles of Sri Lanka, the Mahawansa, the Dipavamsa, the Culavamsa and the Rajaveliya, record Sri Lankan history from the beginnings of the Sinhalese monarchy in the 6th century BC, through the arrival of European colonialists in the sixteenth century, up until the disestablishment of the monarchy in 1815. Some historical records about the country are also included in the famous Indian chronicles, the sage Valmiki's Ramayana and the Mahabharata, and in the ancient books of Gautama Buddha's teachings.

In the period after the sixteenth century, some coastal areas of the country were ruled by the Portuguese, the Dutch and the British. After 1815 the entire nation was ruled by the British colonialists, until political independence was granted in 1948; the country became a sovereign republic in 1972. The Sri Lankan people rose in armed uprisings against British colonial rule in the Uva Rebellion of 1818 and the Matale Rebellion of 1848.

A new constitution, with an Executive President as head of state, was introduced in 1978, after the armed youth uprising of 1971 known as the 1971 April Rebellion. The Sri Lankan Civil War began in 1983, another armed youth uprising took place in the 1987-89 period, and the 26-year civil war ended in 2009.

Significant cultural changes followed the introduction of Buddhism in the 3rd century BC by Arhath Mahinda (the son of the Indian emperor Ashoka the Great), the arrival of European colonialists after the sixteenth century, and the new open economic policies adopted after 1977, all of which changed the cultural values of the country.

History of Science (from Wikipedia)


While empirical investigations of the natural world have been described since antiquity (for example, by Aristotle, Theophrastus and Pliny the Elder), and scientific methods have been employed since the Middle Ages (for example, by Ibn al-Haytham, Abu Rayhan Biruni and Roger Bacon), the dawn of modern science is generally traced back to the early modern period during what is known as the Scientific Revolution of the 16th and 17th centuries.[5]

The word "science" comes through the Old French, and is derived in turn from the Latin scientia, "knowledge", the nominal form of the verb scire, "to know". The Proto-Indo-European (PIE) root that yields scire is *skei-, meaning to "cut, separate, or discern".[6] Similarly, the Greek word for science is 'επιστήμη', deriving from the verb 'επίσταμαι', 'to know'. From the Middle Ages to the Enlightenment, science or scientia meant any systematic recorded knowledge.[7] Science therefore had the same sort of very broad meaning that philosophy had at that time. In other languages, including French, Spanish, Portuguese, and Italian, the word corresponding to science also carries this meaning.

Prior to the 1700s, the preferred term for the study of nature was natural philosophy, while English speakers most typically referred to other philosophical disciplines (such as logic, metaphysics, epistemology, ethics and aesthetics) as moral philosophy. Today, "moral philosophy" is more-or-less synonymous with "ethics". Far into the 1700s, science and natural philosophy were not quite synonymous, but only became so later with the direct use of what would become known formally as the scientific method. By contrast, the word "science" in English was still used in the 17th century (1600s) to refer to the Aristotelian concept of knowledge which was secure enough to be used as a sure prescription for exactly how to do something. In this differing sense of the two words, the philosopher John Locke wrote disparagingly in 1690 that "natural philosophy [the study of nature] is not capable of being made a science".[8]

Locke was to be proven wrong, however. By the early 1800s, natural philosophy had begun to separate from philosophy, though it often retained a very broad meaning. In many cases, science continued to stand for reliable knowledge about any topic, in the same way it is still used in the broad sense (see the introduction to this article) in modern terms such as library science, political science, and computer science. In the more narrow sense of science, as natural philosophy became linked to an expanding set of well-defined laws (beginning with Galileo's laws, Kepler's laws, and Newton's laws for motion), it became more popular to refer to natural philosophy as natural science. Over the course of the nineteenth century, moreover, there was an increased tendency to associate science with study of the natural world (that is, the non-human world). This move sometimes left the study of human thought and society (what would come to be called social science) in a linguistic limbo by the end of the century and into the next.[9]

Through the 1800s, many English speakers were increasingly differentiating science (i.e., the natural sciences) from all other forms of knowledge in a variety of ways. The now-familiar expression “scientific method,” which refers to the prescriptive part of how to make discoveries in natural philosophy, was almost unused until then, but became widespread after the 1870s, though there was rarely total agreement about just what it entailed.[9] The word "scientist", meant to refer to a systematically working natural philosopher (as opposed to an intuitive or empirically minded one), was coined in 1833 by William Whewell.[10] Discussion of scientists as a special group of people who did science, even if their attributes were up for debate, grew in the last half of the 19th century.[9] Whatever people actually meant by these terms at first, they ultimately depicted science, in the narrow sense of the habitual use of the scientific method and the knowledge derived from it, as something deeply distinguished from all other realms of human endeavor.

By the twentieth century (1900s), the modern notion of science as a special kind of knowledge about the world, practiced by a distinct group and pursued through a unique method, was essentially in place. It was used to give legitimacy to a variety of fields through such titles as "scientific" medicine, engineering, advertising, or motherhood.[9] Over the 1900s, links between science and technology also grew increasingly strong.

Richard Feynman described science in the following way for his students: "The principle of science, the definition, almost, is the following: The test of all knowledge is experiment. Experiment is the sole judge of scientific 'truth'. But what is the source of knowledge? Where do the laws that are to be tested come from? Experiment, itself, helps to produce these laws, in the sense that it gives us hints. But also needed is imagination to create from these hints the great generalizations — to guess at the wonderful, simple, but very strange patterns beneath them all, and then to experiment to check again whether we have made the right guess." Feynman also observed, "...there is an expanding frontier of ignorance...things must be learned only to be unlearned again or, more likely, to be corrected."[11]

Wednesday, March 17, 2010


The best six doctors anywhere
And no one can deny it
Are Sunshine, Water, Rest and Air,
And Exercise, and Diet,
These six will gladly attend
If you are only willing
Your ills they'll mend,
Your cares they'll tend,
And charge you not a shilling.