
Sir Timothy John Berners-Lee OM KBE FRS FREng FRSA FBCS (born 8 June 1955),[1] also known as TimBL, is an English engineer and computer scientist, best known as the inventor of the World Wide Web. He is currently a professor of Computer Science at the University of Oxford.[3] He made a proposal for an information management system in March 1989,[4] and he implemented the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the internet in mid-November the same year.[5][6][7][8][9]

Berners-Lee is the director of the World Wide Web Consortium (W3C), which oversees the continued development of the Web. He is also the founder of the World Wide Web Foundation and is a senior researcher and holder of the founders chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).[10] He is a director of the Web Science Research Initiative (WSRI),[11] and a member of the advisory board of the MIT Center for Collective Intelligence.[12][13] In 2011, he was named as a member of the board of trustees of the Ford Foundation.[14] He is a founder and president of the Open Data Institute.

In 2004, Berners-Lee was knighted by Queen Elizabeth II for his pioneering work.[15][16] In April 2009, he was elected a foreign associate of the United States National Academy of Sciences.[17][18] Named in Time magazine's list of the 100 Most Important People of the 20th century, Berners-Lee has received a number of other accolades for his invention.[19] He was honoured as the "Inventor of the World Wide Web" during the 2012 Summer Olympics opening ceremony, in which he appeared in person, working with a vintage NeXT Computer at the London Olympic Stadium.[20] He tweeted "This is for everyone", which was instantly spelled out in LCD lights attached to the chairs of the 80,000 people in the audience.[20] Berners-Lee received the 2016 Turing Award "for inventing the World Wide Web, the first web browser, and the fundamental protocols and algorithms allowing the Web to scale".[22]

Early life and education

Berners-Lee was born in London, England, United Kingdom,[23] one of four children born to Mary Lee Woods and Conway Berners-Lee. His parents worked on the first commercially built computer, the Ferranti Mark 1. He attended Sheen Mount Primary School, and then went on to attend south west London's Emanuel School from 1969 to 1973, at the time a direct grant grammar school, which became an independent school in 1975.[1][15] A keen trainspotter as a child, he learnt about electronics from tinkering with a model railway.[24] He studied at The Queen's College, Oxford, from 1973 to 1976, where he received a first-class bachelor of arts degree in physics.[1][23]

Career

After graduation, Berners-Lee worked as an engineer at the telecommunications company Plessey in Poole, Dorset.[23] In 1978, he joined D. G. Nash in Ferndown, Dorset, where he helped create type-setting software for printers.[23]

Berners-Lee worked as an independent contractor at CERN from June to December 1980. While in Geneva, he proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers.[25] To demonstrate it, he built a prototype system named ENQUIRE.[26]

After leaving CERN in late 1980, he went to work at John Poole's Image Computer Systems, Ltd, in Bournemouth, Dorset.[27] He ran the company's technical side for three years.[28] The project he worked on was a "real-time remote procedure call" which gave him experience in computer networking.[27] In 1984, he returned to CERN as a fellow.[26]

In 1989, CERN was the largest internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the internet:

"I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web[29] ... Creating the web was really an act of desperation, because the situation without it was very difficult when I was working at CERN later. Most of the technology involved in the web, like the hypertext, like the internet, multifont text objects, had all been designed already. I just had to put them together. It was a step of generalising, going to a higher level of abstraction, thinking about all the documentation systems out there as being possibly part of a larger imaginary documentation system."[30]

Berners-Lee wrote his proposal in March 1989 and redistributed it in 1990. It was then accepted by his manager, Mike Sendall.[31] He used ideas similar to those underlying the ENQUIRE system to create the World Wide Web, for which he designed and built the first web browser, WorldWideWeb, which also functioned as an editor and ran on the NeXTSTEP operating system, as well as the first web server, CERN HTTPd (short for Hypertext Transfer Protocol daemon).

"Mike Sendall buys a NeXT cube for evaluation, and gives it to Tim [Berners-Lee]. Tim's prototype implementation on NeXTStep is made in the space of a few months, thanks to the qualities of the NeXTStep software development system. This prototype offers WYSIWYG browsing/authoring! Current Web browsers used in 'surfing the internet' are mere passive windows, depriving the user of the possibility to contribute. During some sessions in the CERN cafeteria, Tim and I try to find a catching name for the system. I was determined that the name should not yet again be taken from Greek mythology..... Tim proposes 'World-Wide Web'. I like this very much, except that it is difficult to pronounce in French..." by Robert Cailliau, 2 November 1995.[32]

The first website was built at CERN. Although CERN is an international organisation hosted by Switzerland, the office Berners-Lee used was just across the border in France.[33] The website was put online for the first time on 6 August 1991:

info.cern.ch was the address of the world's first-ever web site and web server, running on a NeXT computer at CERN. The first web page address was http://info.cern.ch/hypertext/WWW/TheProject.html, which centred on information regarding the WWW project. Visitors could learn more about hypertext, technical details for creating their own webpage, and even an explanation on how to search the Web for information. There are no screenshots of this original page and, in any case, changes were made daily to the information available on the page as the WWW project developed. You may find a later copy (1992) on the World Wide Web Consortium website.[34]

It provided an explanation of what the World Wide Web was, and how one could use a browser and set up a web server.[35][36][37][38] In a list of 80 cultural moments that shaped the world, chosen by a panel of 25 eminent scientists, academics, writers, and world leaders, the invention of the World Wide Web was ranked number one, with the entry stating, "The fastest growing communications medium of all time, the internet has changed the shape of modern life forever. We can connect with each other instantly, all over the world".[39]
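The CERN HTTPd mentioned above was a daemon, that is, a long-running process that answers requests for documents. The following minimal sketch in modern Python (standard library only; the page text and path are invented for illustration, and this is not CERN httpd's actual code) shows the basic request/response cycle between a web client and a web server:

```python
import http.server
import threading
import urllib.request

# A one-page hypertext document; the text is invented for illustration.
PAGE = b"<h1>The World Wide Web project</h1>"

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every GET request with the same hypertext page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, fmt, *args):
        pass  # keep the demonstration quiet

# The "daemon": a server process listening in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client": fetch a document from the server, as a browser would.
url = f"http://127.0.0.1:{server.server_port}/hypertext/WWW/TheProject.html"
with urllib.request.urlopen(url) as response:
    body = response.read()
server.shutdown()

print(body.decode())
```

The essential division of labour is unchanged since 1990: the server waits, the client names a document, and the server returns it over HTTP.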

In 1994, Berners-Lee founded the W3C at the Massachusetts Institute of Technology. It comprised various companies that were willing to create standards and recommendations to improve the quality of the Web. Berners-Lee made his idea available freely, with no patent and no royalties due. The World Wide Web Consortium decided that its standards should be based on royalty-free technology, so that they could easily be adopted by anyone.[40]

In 2001, Berners-Lee became a patron of the East Dorset Heritage Trust, having previously lived in Colehill in Wimborne, East Dorset.[41] In December 2004, he accepted a chair in computer science at the School of Electronics and Computer Science, University of Southampton, Hampshire, to work on the Semantic Web.[42][43]

In a Times article in October 2009, Berners-Lee admitted that the initial pair of slashes ("//") in a web address were "unnecessary". He told the newspaper that he could easily have designed web addresses without the slashes. "There you go, it seemed like a good idea at the time", he said in his lighthearted apology.[44]
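The slashes in question are the pair that introduces the host name in a web address. A short sketch using Python's standard urllib (a modern library, unrelated to Berners-Lee's original code) shows the part of the address the "//" delimits:

```python
from urllib.parse import urlsplit

# Split the address of the first web page into its components.
parts = urlsplit("http://info.cern.ch/hypertext/WWW/TheProject.html")

print(parts.scheme)  # the protocol, "http"
print(parts.netloc)  # the host, "info.cern.ch", introduced by the "//"
print(parts.path)    # the document, "/hypertext/WWW/TheProject.html"
```

Since the colon after the scheme could by itself have signalled that a host name follows, the "//" carries no information of its own, which is the design choice Berners-Lee was joking about.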

Recent work

In June 2009, then-British Prime Minister Gordon Brown announced Berners-Lee would work with the UK government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force.[45] Berners-Lee and Professor Nigel Shadbolt are the two key figures behind data.gov.uk, a UK government project to open up almost all data acquired for official purposes for free re-use. Commenting on the opening up of Ordnance Survey data in April 2010, Berners-Lee said: "The changes signal a wider cultural change in government based on an assumption that information should be in the public domain unless there is a good reason not to—not the other way around." He went on to say: "Greater openness, accountability and transparency in Government will give people greater choice and make it easier for individuals to get more directly involved in issues that matter to them."[46]

In November 2009, Berners-Lee launched the World Wide Web Foundation in order to "advance the Web to empower humanity by launching transformative programs that build local capacity to leverage the Web as a medium for positive change."[47]

Berners-Lee is one of the pioneer voices in favour of net neutrality,[48] and has expressed the view that ISPs should supply "connectivity with no strings attached", and should neither control nor monitor the browsing activities of customers without their expressed consent.[49][50] He advocates the idea that net neutrality is a kind of human network right: "Threats to the internet, such as companies or governments that interfere with or snoop on internet traffic, compromise basic human network rights."[51] Berners-Lee participated in an open letter to the US Federal Communications Commission (FCC). He and 20 other Internet pioneers urged the FCC to cancel its vote, scheduled for 14 December 2017, to repeal net neutrality protections. The letter was addressed to Senator Roger Wicker, Senator Brian Schatz, Representative Marsha Blackburn and Representative Michael F. Doyle.[52]

Berners-Lee joined the board of advisors of start-up State.com, based in London.[53] As of May 2012, Berners-Lee is president of the Open Data Institute,[54] which he co-founded with Nigel Shadbolt in 2012.

The Alliance for Affordable Internet (A4AI) was launched in October 2013 and Berners-Lee is leading the coalition of public and private organisations that includes Google, Facebook, Intel, and Microsoft. The A4AI seeks to make internet access more affordable so that access is broadened in the developing world, where only 31% of people are online. Berners-Lee will work with those aiming to decrease internet access prices so that they fall below the UN Broadband Commission's worldwide target of 5% of monthly income.[55]

Berners-Lee holds the founders chair in Computer Science at the Massachusetts Institute of Technology, where he heads the Decentralized Information Group and is leading Solid, a joint project with the Qatar Computing Research Institute that aims to radically change the way Web applications work today, resulting in true data ownership as well as improved privacy.[56] In October 2016, he joined the Department of Computer Science at Oxford University as a professorial research fellow[57] and as a fellow of Christ Church, one of the Oxford colleges.[58]

Personal life

Berners-Lee married Nancy Carlson in 1990; they had two children and divorced in 2011. In 2014, Berners-Lee married Rosemary Leith at St. James's Palace in London.[59] Leith is director of the World Wide Web Foundation and a fellow at Harvard University's Berkman Center. Previously, she was World Economic Forum Global Agenda Council Chair of the Future of Internet Security[60] and now is on the board of YouGov.[61]

Berners-Lee was raised as an Anglican, but in his youth, he turned away from religion. After he became a parent, he became a Unitarian Universalist (UU).[62] He has stated: "Like many people, I had a religious upbringing which I rejected as a teenager... Like many people, I came back to religion when we had children".[63] He and his wife wanted to teach spirituality to their children, and after hearing a Unitarian minister and visiting the UU Church, they opted for it.[64] He is an active member of that church,[65] to which he adheres because he perceives it as a tolerant and liberal faith. He has said: "I believe that much of the philosophy of life associated with many religions is much more sound than the dogma which comes along with it. So I do respect them."[63]

Distinctions

Main article: List of awards and honours received by Tim Berners-Lee

"He wove the World Wide Web and created a mass medium for the 21st century. The World Wide Web is Berners-Lee's alone. He designed it. He loosed it on the world. And he more than anyone else has fought to keep it open, nonproprietary and free."

—Tim Berners-Lee's entry in Time magazine's list of the 100 Most Important People of the 20th century, March 1999.[19]

Berners-Lee has received many awards and honours. He was knighted by Queen Elizabeth II in the 2004 New Year Honours "for services to the global development of the internet", and was invested formally on 16 July 2004.[15][16]

On 13 June 2007, he was appointed to the Order of Merit (OM), an order restricted to 24 (living) members.[66] Bestowing membership of the Order of Merit is within the personal purview of the Queen, and does not require recommendation by ministers or the Prime Minister. He was elected a Fellow of the Royal Society (FRS) in 2001.[2] He has been awarded honorary degrees by a number of universities around the world, including Manchester (his parents worked on the Manchester Mark 1 in the 1940s), Harvard and Yale.[67][68][69]

In 2012, Berners-Lee was among the British cultural icons selected by artist Sir Peter Blake to appear in a new version of his most famous artwork – the Beatles' Sgt. Pepper's Lonely Hearts Club Band album cover – created to mark his 80th birthday and celebrate the British cultural figures he most admires.[70][71]

In 2013, he was awarded the inaugural Queen Elizabeth Prize for Engineering.[72] On 4 April 2017, he received the 2016 ACM Turing Award "for inventing the World Wide Web, the first web browser, and the fundamental protocols and algorithms allowing the Web to scale".[22]

See also

References

  1. ^ a b c d BERNERS-LEE, Sir Timothy (John). ukwhoswho.com. Who's Who. 2015 (online Oxford University Press ed.). A & C Black, an imprint of Bloomsbury Publishing plc. (subscription required)
  2. ^ a b "Fellowship of the Royal Society 1660–2015". London: Royal Society. Archived from the original on 15 July 2015.
  3. ^ "Sir Tim Berners-Lee joins Oxford's Department of Computer Science". University of Oxford.
  4. ^ "info.cern.ch – Tim Berners-Lee's proposal". Info.cern.ch. Retrieved 21 December 2011.
  5. ^ Tim Berners-Lee's own reference. The exact date is unknown.
  6. ^ Berners-Lee, Tim; Fischetti, Mark (1999). Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by its inventor. Britain: Orion Business. ISBN 0-7528-2090-7.
  7. ^ Berners-Lee, T. (2010). "Long Live the Web". Scientific American. 303 (6): 80–85. doi:10.1038/scientificamerican1210-80. PMID 21141362.
  8. ^ Shadbolt, N.; Berners-Lee, T. (2008). "Web science emerges". Scientific American. 299 (4): 76–81. doi:10.1038/scientificamerican1008-76. PMID 18847088.
  9. ^ Berners-Lee, T.; Hall, W.; Hendler, J.; Shadbolt, N.; Weitzner, D. (2006). "Computer Science: Enhanced: Creating a Science of the Web". Science. 313 (5788): 769–771. doi:10.1126/science.1126902. PMID 16902115.
  10. ^ "Draper Prize". Massachusetts Institute of Technology. Retrieved 25 May 2008.
  11. ^ "People". The Web Science Research Initiative. Archived from the original on 28 June 2008. Retrieved 17 January 2011.
  12. ^ "MIT Center for Collective Intelligence (homepage)". Cci.mit.edu. Retrieved 15 August 2010.
  13. ^ "MIT Center for Collective Intelligence (people)". Cci.mit.edu. Archived from the original on 11 June 2010. Retrieved 15 August 2010.
  14. ^ Bratt, Steve (29 September 2011). "Sir Tim Berners-Lee Named to the Ford Foundation Board". World Wide Web Foundation. Retrieved 22 August 2017.
  15. ^ a b c "Web's inventor gets a knighthood". BBC News. 31 December 2003. Retrieved 10 November 2015.
  16. ^ a b "Creator of the web turns knight". BBC News. 16 July 2004. Retrieved 10 November 2015.
  17. ^ "Timothy Berners-Lee Elected to National Academy of Sciences". Dr. Dobb's Journal. Retrieved 9 June 2009.
  18. ^ "72 New Members Chosen By Academy" (Press release). United States National Academy of Sciences. 28 April 2009. Retrieved 17 January 2011.
  19. ^ a b Quittner, Joshua (29 March 1999). "Tim Berners Lee—Time 100 People of the Century". Time Magazine.
  20. ^ a b Friar, Karen (28 July 2012). "Sir Tim Berners-Lee stars in Olympics opening ceremony". ZDNet. Retrieved 28 July 2012.
  21. ^ a b "A. M. Turing Award". Association for Computing Machinery. 2016. Retrieved 4 April 2017.
  22. ^ a b c d "Berners-Lee Longer Biography". World Wide Web Consortium. Retrieved 18 January 2011.
  23. ^ "Lunch with the FT: Tim Berners-Lee". Financial Times.
  24. ^ "Berners-Lee's original proposal to CERN". World Wide Web Consortium. March 1989. Retrieved 25 May 2008.
  25. ^ a b Stewart, Bill. "Tim Berners-Lee, Robert Cailliau, and the World Wide Web". Retrieved 22 July 2010.
  26. ^ a b Berners-Lee, Tim. "Frequently asked questions". World Wide Web Consortium. Retrieved 22 July 2010.
  27. ^ Grossman, Wendy (15 July 1996). "All you never knew about the Net ...". The Independent.
  28. ^ Berners-Lee, Tim. "Answers for Young People". World Wide Web Consortium. Retrieved 25 May 2008.
  29. ^ "Biography and Video Interview of Timothy Berners-Lee at Academy of Achievement". Achievement.org. Archived from the original on 1 January 2012. Retrieved 21 December 2011.
  30. ^ "Ten Years Public Domain for the Original Web Software". CERN. Retrieved 21 July 2010.
  31. ^ Gromov, Gregory. Roads and Crossroads of Internet History, Chapter 4: Birth of the Web > 1990.
  32. ^ "Tim Berners-Lee. Confirming The Exact Location Where the Web Was Invented". davidgalbraith.org. 8 July 2010.
  33. ^ "The World Wide Web project". cern.ch. Retrieved 29 March 2016.
  34. ^ "Welcome to info.cern.ch, the website of the world's first-ever web server". CERN. Retrieved 25 May 2008.
  35. ^ "World Wide Web—Archive of world's first website". World Wide Web Consortium. Retrieved 25 May 2008.
  36. ^ "World Wide Web—First mentioned on USENET". Google. 6 August 1991. Retrieved 25 May 2008.
  37. ^ "The original post to alt.hypertalk describing the WorldWideWeb Project". Google Groups. Google. 9 August 1991. Retrieved 25 May 2008.
  38. ^ "80 moments that shaped the world". British Council. Archived from the original on 30 June 2016. Retrieved 13 May 2016.
  39. ^ "Patent Policy—5 February 2004". World Wide Web Consortium. 5 February 2004. Retrieved 25 May 2008.
  40. ^ Klooster, John W. (2009). Icons of Invention: the makers of the modern world from Gutenberg to Gates. ABC-CLIO. p. 611.
  41. ^ Berners-Lee, T.; Hendler, J.; Lassila, O. (2001). "The Semantic Web". Scientific American. 284 (5): 34. doi:10.1038/scientificamerican0501-34.
  42. ^ "Tim Berners-Lee, World Wide Web inventor, to join ECS". World Wide Web Consortium. 2 December 2004. Retrieved 25 May 2008.
  43. ^ "Berners-Lee 'sorry' for slashes". BBC. 14 October 2009. Retrieved 14 October 2009.
  44. ^ "Tim Berners-Lee". World Wide Web Consortium. 10 June 2009. Retrieved 10 July 2009.
  45. ^ "Ordnance Survey offers free data access". BBC News. 1 April 2010. Retrieved 3 April 2009.
  46. ^ FAQ—World Wide Web Foundation. Retrieved 18 January 2011.
  47. ^ Ghosh, Pallab (15 September 2008). "Web creator rejects net tracking". BBC. Retrieved 15 September 2008.
  48. ^ Cellan-Jones, Rory (March 2008). "Web creator rejects net tracking". BBC. Retrieved 25 May 2008.
  49. ^ Adams, Stephen (March 2008). "Web inventor's warning on spy software". The Daily Telegraph. London. Retrieved 25 May 2008.
  50. ^ Berners-Lee, Tim (December 2010). "Long Live the Web: A Call for Continued Open Standards and Neutrality". Scientific American. Retrieved 21 December 2011.
  51. ^ "Vint Cerf, Tim Berners-Lee, and 19 other technologists pen letter asking FCC to save net neutrality". VB News. Retrieved 14 December 2017.
  52. ^ "State.com/about/people". Archived from the original on 4 March 2016. Retrieved 9 September 2013.
  53. ^ Computing, Government (23 May 2012). "Government commits £10m to Open Data Institute". The Guardian.
  54. ^ Gibbs, Samuel (7 October 2013). "Sir Tim Berners-Lee and Google lead coalition for cheaper internet". The Guardian. Retrieved 8 October 2013.
  55. ^ Weinberger, David (10 August 2016). "How the father of the World Wide Web plans to reclaim it from Facebook and Google". Digital Trends. Retrieved 31 October 2016.
  56. ^ "Sir Tim Berners-Lee joins Oxford's Department of Computer Science". UK: University of Oxford. 27 October 2016.
  57. ^ "Sir Tim Berners-Lee joins Oxford's Department of Computer Science and Christ Church". UK: Christ Church, Oxford. 27 October 2016. Retrieved 14 November 2016.
This NeXT Computer was used by Berners-Lee at CERN and became the world's first web server
Tim Berners-Lee at the Home Office, London, on 11 March 2010

This article is about the profession. For other uses, see Scientist (disambiguation).

A scientist is a person engaging in a systematic activity to acquire knowledge that describes and predicts the natural world. In a more restricted sense, a scientist may refer to an individual who uses the scientific method.[1] The person may be an expert in one or more areas of science.[2] The term scientist was coined by the theologian, philosopher, and historian of science William Whewell in 1833. This article focuses on the more restricted use of the word. Scientists perform research toward a more comprehensive understanding of nature, including physical, mathematical and social realms.

Philosophy is today typically regarded as a distinct activity from science, though the activities were not always distinguished in this fashion; prior to modernity, science was considered a "branch" of philosophy rather than opposed to it. Philosophers aim to provide a comprehensive understanding of fundamental aspects of reality and experience, often pursuing inquiries with conceptual, rather than empirical, methods. Natural scientific research is usually also distinguished from inquiry in the humanities more generally, and often from inquiry in the social sciences and mathematics, on various grounds, although these distinctions may be controversial.

When science is done with a goal toward practical utility, it is called applied science. An applied scientist may not be designing something in particular, but rather is conducting research with the aim of developing new technologies and practical methods. When science seeks to answer questions about fundamental aspects of reality it is sometimes called natural philosophy, as it was generally known before the 19th century.

Description

Science and technology have continually modified human existence through the engineering process. As a profession the scientist of today is widely recognized. Scientists include theoreticians who mainly develop new models to explain existing data and predict new results, and experimentalists who mainly test models by making measurements — though in practice the division between these activities is not clear-cut, and many scientists perform both tasks.

There is a continuum from the most theoretical to the most empirical scientists with no distinct boundaries. In terms of personality, interests, training and professional activity, there is little difference between applied mathematicians and theoretical physicists.

Scientists can be motivated in several ways. Many have a desire to understand why the world is as we see it and how it came to be, and exhibit a strong curiosity about reality. Other motivations are recognition by their peers and prestige, or the desire to apply scientific knowledge for the benefit of people's health, their nation, the world, nature, or industry (academic scientists and industrial scientists). Scientists tend to be less motivated by direct financial reward for their work than people in other careers. As a result, scientific researchers often accept lower average salaries than many other professions that require a similar amount of training and qualification.[3]

Demography

By country

The number of scientists is vastly different from country to country. For instance, there are only 4 full-time scientists per 10,000 workers in India while this number is 79 for the United Kingdom and the United States.[4]

United States

According to the United States National Science Foundation 4.7 million people with science degrees worked in the United States in 2015, across all disciplines and employment sectors. The figure included twice as many men as women. Of that total, 17% worked in academia, that is, at universities and undergraduate institutions, and men held 53% of those positions. 5% of scientists worked for the federal government and about 3.5% were self-employed. Of the latter two groups, two-thirds were men. 59% of US scientists were employed in industry or business, and another 6% worked in non-profit positions.[5]

By gender

See also: Women in science

Science and engineering statistics are usually intertwined, but they indicate that women enter these fields far less often than men, though the gap is narrowing. The share of science and engineering doctorates awarded to women rose from a mere 7 percent in 1970 to 34 percent in 1985, and in engineering alone the number of bachelor's degrees awarded to women rose from only 385 in 1975 to more than 11,000 in 1985.[6]

This inequality follows into the professional setting in terms of both position and income. According to Eisenhart and Finkel, women's experiences, even when they have equal qualifications, are that they start in lower positions while men are granted tenure-track positions. This is later reflected in the gender balance of tenured university scientists: "as of 1989, 65 percent of men and only 40 percent of women held tenured positions." Income disparities appear when median annual salaries for full-time employed civilian scientists are compared: "salary for men is $48,000, and that for women is $42,000."[7]

Historical development and etymology of the term

See also: Timeline of the history of scientific method and Scientific revolution

Until the late 19th or early 20th century, scientists were called "natural philosophers" or "men of science".[8][9][10][11]

English philosopher and historian of science William Whewell coined the term scientist in 1833, and it first appeared in print in Whewell's anonymous 1834 review of Mary Somerville's On the Connexion of the Physical Sciences published in the Quarterly Review.[12] Whewell's suggestion of the term was partly satirical, a response to changing conceptions of science itself in which natural knowledge was increasingly seen as distinct from other forms of knowledge. Whewell wrote of "an increasing proclivity of separation and dismemberment" in the sciences; while highly specific terms proliferated—chemist, mathematician, naturalist—the broad term "philosopher" was no longer satisfactory to group together those who pursued science, without the caveats of "natural" or "experimental" philosopher. Members of the British Association for the Advancement of Science had been complaining about the lack of a good term at recent meetings, Whewell reported in his review; alluding to himself, he noted that "some ingenious gentleman proposed that, by analogy with artist, they might form [the word] scientist, and added that there could be no scruple in making free with this term since we already have such words as economist, and atheist—but this was not generally palatable".[13]

Whewell proposed the word again more seriously (and not anonymously) in his 1840 The Philosophy of the Inductive Sciences:[14]

As we cannot use physician for a cultivator of physics, I have called him a physicist. We need very much a name to describe a cultivator of science in general. I should incline to call him a Scientist. Thus we might say, that as an Artist is a Musician, Painter, or Poet, a Scientist is a Mathematician, Physicist, or Naturalist.

He also proposed the term physicist at the same time, as a counterpart to the French word physicien. Neither term gained wide acceptance until decades later; scientist became a common term in the late 19th century in the United States and around the turn of the 20th century in Great Britain.[12][15][16] By the twentieth century, the modern notion of science as a special brand of information about the world, practiced by a distinct group and pursued through a unique method, was essentially in place.

The social roles of "scientists", and their predecessors before the emergence of modern scientific disciplines, have evolved considerably over time. Scientists of different eras (and before them, natural philosophers, mathematicians, natural historians, natural theologians, engineers, and others who contributed to the development of science) have had widely different places in society, and the social norms, ethical values, and epistemic virtues associated with scientists—and expected of them—have changed over time as well. Accordingly, many different historical figures can be identified as early scientists, depending on which elements of modern science are taken to be essential.

Some historians point to the 17th century as the period when science in a recognizably modern form developed (what is popularly called the Scientific Revolution). It was not until the 19th century, however, that sufficient socioeconomic changes had occurred for scientists to emerge as a major profession.[18]

Ancient and medieval science

Knowledge about nature in Classical Antiquity was pursued by many kinds of scholars. Greek contributions to science—including works of geometry and mathematical astronomy, early accounts of biological processes and catalogs of plants and animals, and theories of knowledge and learning—were produced by philosophers and physicians, as well as practitioners of various trades. These roles, and their associations with scientific knowledge, spread with the Roman Empire and, with the spread of Christianity, became closely linked to religious institutions in most European countries. Astrology and astronomy became an important area of knowledge, and the role of astronomer/astrologer developed with the support of political and religious patronage. By the time of the medieval university system, knowledge was divided into the trivium—philosophy, including natural philosophy—and the quadrivium—mathematics, including astronomy. Hence, the medieval analogs of scientists were often either philosophers or mathematicians. Knowledge of plants and animals was broadly the province of physicians.

Science in medieval Islam generated some new modes of developing natural knowledge, although still within the bounds of existing social roles such as philosopher and mathematician. Many proto-scientists from the Islamic Golden Age are considered polymaths, in part because of the lack of anything corresponding to modern scientific disciplines. Many of these early polymaths were also priests and theologians: for example, Alhazen and al-Biruni were mutakallimiin; the physician Avicenna was a hafiz; the physician Ibn al-Nafis was a hafiz, muhaddith and ulema; the botanist Otto Brunfels was a theologian and historian of Protestantism; the astronomer and physician Nicolaus Copernicus was a priest. During the Italian Renaissance, figures such as Leonardo da Vinci, Michelangelo, Galileo Galilei and Gerolamo Cardano are considered among the most recognizable polymaths.

Historical scientists

During the Renaissance, Italians made substantial contributions to science. Leonardo da Vinci made significant discoveries in paleontology and anatomy. The father of modern science,[19][20] Galileo Galilei, made key improvements to the thermometer and telescope, which allowed him to observe and clearly describe the solar system. Descartes was not only a pioneer of analytic geometry but also formulated a theory of mechanics[21] and advanced ideas about the origins of animal movement and perception. Vision interested the physicists Young and Helmholtz, who also studied optics, hearing and music. Newton extended Descartes's mathematics by inventing calculus (contemporaneously with Leibniz). He provided a comprehensive formulation of classical mechanics and investigated light and optics. Fourier founded a new branch of mathematics, infinite periodic series; studied heat flow and infrared radiation; and discovered the greenhouse effect. Girolamo Cardano, Blaise Pascal, Pierre de Fermat, von Neumann, Turing, Khinchin, Markov and Wiener, all mathematicians, made major contributions to science and probability theory, including the ideas behind computers and some of the foundations of statistical mechanics and quantum mechanics. Many mathematically inclined scientists, including Galileo, were also musicians.

Luigi Galvani, a pioneer of bioelectromagnetics, discovered animal electricity. He found that a charge applied to the spinal cord of a frog could generate muscular spasms throughout its body, and that charges could make frog legs jump even when the legs were no longer attached to a frog. While cutting a frog leg, Galvani's steel scalpel touched a brass hook that was holding the leg in place, and the leg twitched. Further experiments confirmed this effect, and Galvani became convinced that he was seeing the effects of what he called animal electricity, the life force within the muscles of the frog. At the University of Pavia, Galvani's colleague Alessandro Volta was able to reproduce the results but was sceptical of Galvani's explanation.[22]

During the Age of Enlightenment, Francesco Redi showed experimentally that maggots arise from fly eggs rather than from spontaneous generation, a result that Louis Pasteur later extended to microorganisms. There are many compelling stories in medicine and biology, such as the development of ideas about the circulation of the blood from Galen to Harvey. The flowering of genetics and molecular biology in the 20th century is replete with famous names. Ramón y Cajal won the Nobel Prize in 1906 for his remarkable observations in neuroanatomy.

Marie Curie became the first woman to win a Nobel Prize and the first person to win it twice. Her work led to the development of nuclear energy and of radiotherapy for the treatment of cancer. In 1922, she was appointed a member of the International Commission on Intellectual Co-operation by the Council of the League of Nations. She campaigned for scientists' right to patent their discoveries and inventions, for free access to international scientific literature, and for internationally recognized scientific symbols.

Lazzaro Spallanzani was one of the most influential figures in experimental physiology and the natural sciences. His investigations have exerted a lasting influence on the medical sciences, and he made important contributions to the experimental study of bodily functions and animal reproduction.[23]

Some see a dichotomy between experimental sciences and purely "observational" sciences such as astronomy, meteorology, oceanography and seismology. But astronomers have done basic research in optics, developed charge-coupled devices, and in recent decades have sent space probes to study other planets, in addition to using the Hubble Space Telescope to probe the origins of the Universe some 14 billion years ago. Microwave spectroscopy has now identified dozens of organic molecules in interstellar space, requiring laboratory experimentation and computer simulation to confirm the observational data and starting a new branch of chemistry. Computer modeling and numerical methods are techniques required of students in every field of quantitative science.

Types of scientists

Those considering science as a career often look to the frontiers. These include cosmology and biology, especially molecular biology and the Human Genome Project. Other areas of active research include the exploration of matter at the scale of elementary particles, as described by high-energy physics, and materials science, which seeks to discover and design new materials. Although there have been remarkable discoveries about brain function and neurotransmitters, the nature of the mind and human thought remains unknown.

References

External articles

Further reading
  • Alison Gopnik, "Finding Our Inner Scientist", Daedalus, Winter 2004.
  • Charles George Herbermann, The Catholic Encyclopedia. Science and the Church. The Encyclopedia press, 1913. v.13. Page 598.
  • Thomas Kuhn, The Structure of Scientific Revolutions, 1962.
  • Arthur Jack Meadows. The Victorian Scientist: The Growth of a Profession, 2004. ISBN 0-7123-0894-6.
  • Science, The Relation of Pure Science to Industrial Research. American Association for the Advancement of Science. Page 511 onwards.
Image captions from the original article:
  • "No one in the history of civilization has shaped our understanding of science and natural philosophy more than the great Greek philosopher and scientist Aristotle (384–322 BC), who exerted a profound and pervasive influence for more than two thousand years." —Gary B. Ferngren[17]
  • Francesco Redi, referred to as the father of modern parasitology, is the founder of experimental biology.
  • Physicist Albert Einstein developed the general theory of relativity and made many substantial contributions to physics.
  • Physicist Enrico Fermi is credited with the creation of the world's first nuclear reactor.
  • Atomic physicist Niels Bohr made fundamental contributions to understanding atomic structure and quantum theory.
  • Marine biologist Rachel Carson launched the 20th-century environmental movement.
  1. ^ Isaac Newton (1687, 1713, 1726). "Rules for the study of natural philosophy", Philosophiae Naturalis Principia Mathematica, third edition. The General Scholium containing the four rules follows Book 3, The System of the World. Reprinted on pages 794–796 of I. Bernard Cohen and Anne Whitman's 1999 translation, University of California Press, ISBN 0-520-08817-4, 974 pages.
  2. ^ Oxford English Dictionary, 2nd ed., 1989.
  3. ^ http://www.thisismoney.co.uk/money/article-2269520/Best-paid-jobs-2012-Official-figures-national-average-UK-salaries-400-occupations.html
  4. ^ a b Richard van Noorden (2015). "India by the numbers". Nature 521: 142–143 (14 May 2015).
  5. ^ "Employment: Male majority". Nature. 542 (7642): 509. 2017-02-22. doi:10.1038/nj7642-509b.
  6. ^ Margaret A. Eisenhart, Elizabeth Finkel (1998). Women's Science: Learning and Succeeding from the Margins. University of Chicago Press. p. 18.
  7. ^ Eisenhart and Finkel, ch. 1 in The Gender and Science Reader, ed. Muriel Lederman and Ingrid Bartsch. New York: Routledge, 2001. pp. 16–17.
  8. ^ Nineteenth-Century Attitudes: Men of Science. http://www.rpi.edu/~rosss2/book.html
  9. ^ Friedrich Ueberweg, History of Philosophy: From Thales to the Present Time. C. Scribner's Sons, v. 1, 1887.
  10. ^ Steve Fuller, Kuhn vs. Popper: The Struggle for the Soul of Science. Columbia University Press, 2004. p. 43. ISBN 0-231-13428-2.
  11. ^ Science, American Association for the Advancement of Science, 1917. v. 45, Jan–Jun 1917. p. 274.
  12. ^ a b Ross, Sydney (1962). "Scientist: The story of a word" (PDF). Annals of Science. 18 (2): 65–85. doi:10.1080/00033796200202722. Retrieved 2011-03-08. To be exact, the person who coined the term scientist was referred to in Whewell 1834 only as "some ingenious gentleman." Ross added a comment that this "some ingenious gentleman" was Whewell himself, without giving the reason for the identification. Ross 1962, p. 72.
  13. ^ Holmes, R. (2008). The Age of Wonder: How the Romantic Generation Discovered the Beauty and Terror of Science. London: Harper Press. p. 449. ISBN 978-0-00-714953-7.
  14. ^ a b Whewell, William. The Philosophy of the Inductive Sciences, Volume 1. Cambridge: John W. Parker; J. & J. Deighton. p. cxiii. In the 1847 second edition, moved to volume 2, page 560.
  15. ^ "William Whewell (1794–1866) gentleman of science". Retrieved 2007-05-19.
  16. ^ Tamara Preaud, Derek E. Ostergard, The Sèvres Porcelain Manufactory. Yale University Press, 1997. 416 pages. ISBN 0-300-07338-0. p. 36.
  17. ^ Gary B. Ferngren (2002). Science and Religion: A Historical Introduction. JHU Press. p. 33. ISBN 0-8018-7038-0.
  18. ^ On the historical development of the character of scientists and their predecessors, see: Steven Shapin (2008). The Scientific Life: A Moral History of a Late Modern Vocation. Chicago: University of Chicago Press. ISBN 0-226-75024-8.
  19. ^ Einstein (1954, p. 271). "Propositions arrived at by purely logical means are completely empty as regards reality. Because Galileo realised this, and particularly because he drummed it into the scientific world, he is the father of modern physics—indeed, of modern science altogether."
  20. ^ Stephen Hawking, "Galileo and the Birth of Modern Science", archived 2012-03-24 at the Wayback Machine. American Heritage's Invention & Technology, Spring 2009, Vol. 24, No. 1, p. 36.
  21. ^ Peter Damerow (2004). "Introduction". Exploring the Limits of Preclassical Mechanics: A Study of Conceptual Development in Early Modern Science: Free Fall and Compounded Motion in the Work of Descartes, Galileo and Beeckman. Springer Science & Business Media. p. 6.
  22. ^ Robert Routledge (1881). A Popular History of Science (2nd ed.). G. Routledge and Sons. p. 553. ISBN 0-415-38381-1.
  23. ^ "Spallanzani – Uomo e scienziato" [Spallanzani – man and scientist] (in Italian). Il museo di Lazzaro Spallanzani. Archived from the original on 2010-06-03. Retrieved 2010-06-07.
