We take technology for granted today, but how does it affect companies like NASA? Check out the video:
- Shmoop Editorial Team. (2008, November 11). The 1950s. Retrieved July 3, 2014, from http://www.shmoop.com/1950s/
- This article offers a summary of how the 1950s affected the population. It includes analyses of diplomacy, politics, the economy, society, culture, and, most importantly, technology and science. Technology and science is the most important topic, as those fields made the future more innovative.
- Nugget: “We take computers so much for granted today that it’s hard even to imagine a time when they didn’t exist. The closest thing to a computer in 1950 was the Electrical Numerical Integrator And Calculator or ENIAC. Constructed out of 18,000 vacuum tubes and consuming about 180,000 watts of electrical power, the ENIAC was capable of multiplying numbers rapidly… and not much more.”
Checking out the statistics of this computer, it was very slow. I can't believe that a computer used to be a ginormous brick capable of practically nothing. Today we use computers to play games, surf the internet, send emails, develop applications, and much more. And while there are many programming languages out there today, the earliest computers like the ENIAC had essentially none; they were programmed by hand, with switches and cables.
- Nugget: “Though television had been invented in the 1930s, few Americans had watched a TV show even into the late 1940s. But by the end of the Fifties, TVs were present in 90% of homes and watching television was the favorite leisure activity of nearly half the population.”
Before the creation of the television, families would gather in the living room around a massive radio. The radio presented sitcoms, broadcast shows over the air, and delivered news to the public. The television revolutionized technology and changed the media forever. Now the public could actually see what was going on, instead of only hearing about it as they had before.
- Oxford, T. (2009, August 5). 5 technologies to thank the 1950s for. TechRadar. Retrieved July 1, 2014, from http://www.techradar.com/news/world-of-tech/5-technologies-to-thank-the-1950s-for-623013
- The middle of the 20th century was one of the most technologically advanced and innovative eras. This period defined how technology would change over the years. The microchip and the stored-program computer were two innovative advancements of the time that led to inventions such as the smartphones, laptops, and desktops of today.
- Nugget: “Looking a lot like the slightly dishevelled uncle that gets drunk at family parties, the first microchip bears little resemblance to its modern equivalent. Jack Kilby of Texas Instruments and Robert Noyce of the Fairchild Semiconductor Corporation (he also co-founded Intel) are credited as being co-founders of the first integrated circuit, in spite of the fact that their creations were six months apart.
Where Jack Kilby managed to develop the first working model in 1958, Robert Noyce’s version had some necessary improvements – such as the use of silicon instead of germanium and interconnecting the components efficiently. The first commercially viable microchips were released by the Fairchild Semiconductor Corporation in 1961 and were about the size of your baby finger.
They consisted of one transistor, three resistors and one capacitor, a far cry from the tiny chip of today that can hold 125 million transistors.”
The microchip is a tiny wafer of semiconducting material used to make an integrated circuit. Its function depends on what it is designed to do: it could be a microprocessor, a memory chip, or a digital tuner, and it could be used in your wristwatch, microwave oven, cell phone, garage door opener, the space shuttle, or almost anything. When it was first created, it served small purposes. Today, microchips are embedded in animals, computers, calculators, laptops, and much more.
- Nugget: "While these weren't very good, they did kick-start development. It was Kapany who coined the term fibre optics in 1956 but it was van Heel who discovered that, by covering the bare fibre/glass/plastic with a transparent cladding, contamination and crosstalk were greatly reduced.
Then, in the late 1950s, Lawrence Curtiss improved on this even further by introducing glass clad fibres. The invention of the laser in 1960 heralded steady advancement in fibre optic communications, with the semiconductor laser, developed in 1962, still being the most widely used today."
Everyone today knows about Verizon Fiber Optics, a high-speed network for television, internet, and phone services. The earliest designs for fiber optics, dating back to the 1840s, were slow and impractical. After a hundred years came massive development: imaging bundles were just a start, and today fiber optics is used for far more than imaging.
- Oxford, T. (2009, November 19). 6 technologies to thank the 1960s for. TechRadar. Retrieved July 2, 2014, from http://www.techradar.com/news/world-of-tech/6-technologies-to-thank-the-1960s-for-650980
- The 1960s built upon the decade before, bringing the first video game console, the computer mouse, light-emitting diodes, and more. As the decades go on, technology increases significantly as well. The implementation of UNIX definitely sent a huge message across the technological world.
- Nugget: “In 1969 a group of employees from AT&T at Bell Labs created one of the most popular and powerful operating systems of the age, UNIX. Ken Thompson, Dennis Ritchie, Brian Kernighan, Douglas McIlroy and Joe Ossanna were among the crew who sat down to develop UNIX on the PDP-7. The name was derived from MULTICS, a project run in conjunction with several large companies including Bell Labs that failed to deliver on expectations.”
Unix is a multitasking, multiuser computer operating system that exists in many variants. The OS provides a set of simple tools that each perform a limited, well-defined function, with a unified filesystem as the main method of communication. For programmers this was a huge advancement: UNIX spread across computers, and many programmers took advantage of it to build new programs and even new languages.
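That "small tools, each doing one job, chained together" philosophy is easy to sketch. The following is a purely hypothetical Python illustration (not Bell Labs code): each function mimics a classic Unix utility, and data flows from one to the next the way it would through a shell pipeline like `cat log | grep ERROR | sort | uniq`.

```python
# Each "tool" does one small, well-defined job, like a Unix utility.
def cat(lines):
    """Emit the input lines unchanged (like `cat`)."""
    return list(lines)

def grep(pattern, lines):
    """Keep only lines containing the pattern (like `grep`)."""
    return [ln for ln in lines if pattern in ln]

def sort(lines):
    """Order the lines (like `sort`)."""
    return sorted(lines)

def uniq(lines):
    """Drop adjacent duplicate lines (like `uniq`)."""
    out = []
    for ln in lines:
        if not out or out[-1] != ln:
            out.append(ln)
    return out

# "Pipe" the tools together: cat log | grep ERROR | sort | uniq
log = ["ERROR disk", "INFO boot", "ERROR net", "ERROR disk"]
result = uniq(sort(grep("ERROR", cat(log))))
print(result)  # ['ERROR disk', 'ERROR net']
```

The point is that none of the pieces knows about the others; they only agree on a common format (lines of text), which is exactly what made Unix tools so composable.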
- Nugget: “While not technically the man who invented RAM, Robert Dennard was the man who redesigned and modified it to create Dynamic Random Access Memory (DRAM). His insights into how RAM could function more efficiently over a smaller space mean that computers got more memory for less cost and, frankly, took up less space.”
Random Access Memory (RAM) is a type of computer memory that can be accessed randomly; that is, any byte of memory can be read without touching the preceding bytes. RAM is the most common type of memory found in computers and other devices, such as printers. There are two kinds of RAM: Dynamic Random Access Memory (DRAM) and Static Random Access Memory (SRAM). DRAM is the type of physical memory used in most personal computers; the term dynamic indicates that the memory must be constantly refreshed, or re-energized, or it will lose its contents. SRAM is faster and more reliable than the more common DRAM; the term static comes from the fact that it doesn't need to be refreshed the way dynamic RAM does.
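"Random access" just means any address can be read or written directly, without scanning through the earlier ones. Here's a toy Python sketch of that idea (purely illustrative; real hardware is nothing like a Python object):

```python
# A toy model of random-access memory: a flat array of byte values.
memory = bytearray(1024)  # 1 KB of zeroed "RAM"

# Write directly to address 512; no need to walk addresses 0..511 first.
memory[512] = 0xAB

# Read it back in a single step, again without touching earlier bytes.
value = memory[512]
print(hex(value))  # 0xab
```

Contrast this with sequential media like a tape drive, where reaching byte 512 means physically moving past the 511 bytes before it.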
Synthesis: All of these nuggets relate to each other, and so do the articles. Technology doesn't just refer to computers or machines; technology was the invention of the wheel, the axle, the discovery of the atom. But these articles show a timeline of how small inventions led to massive explorations in the field. One small chip aboard a space shuttle requires a lot of programming, which traces back to systems like UNIX. All of these innovations are connected, and they are constantly improving today.
While finding tweets seemed like a difficult task at hand, it actually turned out to be a good experience. I am a regular user of Twitter, posting thoughts and voicing my opinions on the regular.
Finding people whose posts relate to mine was not as bad as I thought it was going to be. I found a lot of people talking about technology and security. For what I am studying and researching, security and the use of technology is a huge topic, especially since I have to dig deep into security and research during the Cold War. Twitter is an application that can be hugely beneficial for communication, research, and connecting with individuals about anything.
As for receiving feedback, I was kind of worried whether I would get any. I thought my topic was out of the ballpark, but it's surprising how close others' interests turned out to be.
The last thing I imagined was someone sending me a tweet with an article about my topic to help me with my research. I saw a lot of people interacting with each other and didn't know if my post would be relevant.
At first, I didn't think security was going to be a big issue, until I interacted with Symone Allen. The conversation made me realize that the Cold War had many security issues and alerts that both NASA and the USA were trying to take care of. Security is a massive way to understand how technology changed over time, and how the public was able to see what NASA was doing.
In order to fully understand how media has changed over the years, Justin Tubbs and I conversed about our topics. While our topics are not the same, there are similarities in how our research will be done. How media and technology changed over time are related: from newspapers, to the internet, to the iPhone, to the high-tech plasma and high-definition TVs.
Needless to say, I am excited for all this research.
This is an initial summary report of a project taking a new and systematic approach to improving the intellectual effectiveness of the individual human being. It is more of a detailed conceptual framework that explores the nature of the system composed of the individual and the tools, concepts, and methods that match his basic capabilities to his problems. One of the tools showing the greatest immediate promise is the computer, when it can be harnessed for direct on-line assistance and integrated with new concepts and methods. These are just a few words of summary that can be gathered from this excerpt. But there was one nugget that really stood out to me:
Our culture has evolved means for us to organize the little things we can do with our basic capabilities so that we can derive comprehension from truly complex situations, and accomplish the processes of deriving and implementing problem solutions. The ways in which human capabilities are thus extended are here called augmentation means, and we define four basic classes of them:
- “Artifacts–physical objects designed to provide for human comfort, for the manipulation of things or materials, and for the manipulation of symbols.
- Language–the way in which the individual parcels out the picture of his world into the concepts that his mind uses to model that world, and the symbols that he attaches to those concepts and uses in consciously manipulating the concepts (“thinking”).
- Methodology–the methods, procedures, strategies, etc., with which an individual organizes his goal-centered (problem-solving) activity.
- Training–the conditioning needed by the human being to bring his skills in using Means 1, 2, and 3 to the point where they are operationally effective.”
This excerpt by Douglas C. Engelbart really stood out because it represents us humans today. Artifacts are so important now, whether it's a calculator, a laptop, or a television to watch anything we want.
Then the way we read, see, and understand such artifacts, and what is projected through them, defines how we think. If there is a problem to solve, we take the artifact, use the language necessary to frame the problem, and then follow a procedure to reach the solution.
A language doesn’t mean Arabic, Hindi, English, Chinese, or a spoken language. It could mean a computer language or the way we think and process.
Finally, to put all three together, we must train to get the most accurate and precise results. "Practice makes perfect" is what I am sure every parent told, and still tells, their kids.
Two weeks have already flown by and I have a few topics in mind that I can definitely use. I always like to talk about military topics and ideas. But lately, space, NASA, extraterrestrial beings, and similar topics have really interested me. The first research that made me realize I want to study something relating to space was when I was brainstorming for my inquiry project. When I was throwing ideas out there, I thought about NASA and space.
Creating an interest inventory further pushed my ideas about space. It was also during this time that my dad came home and told me there was a channel just for NASA. Launches are rare these days, but fortunately NASA shows some old ones. Why does the media not show any information about what Hubble or the ISS is doing? They are always showing some kind of war or politics. Space exploration is at an all-time low, and news about it is minimal. The work of space exploration revolves around the partnership between man and machine. Without effort from both sides, there would never have been any missions, or even the satellites we rely on for TV, internet, and cell phones. We would never have gotten any images either. It's crazy how much space has done for us.
Licklider brought up a very good point: it is clearer where technology needs to go. Licklider's essay is old, so it is somewhat outdated; by outdated I mean that its goals have either been reached or are really close. This internet adventure showed me how everything is connected, how topics are related, and how the computer makes it easier to connect similar topics. It is like a family tree, or a chronological order: you start with an idea at the head, and it branches downwards. But ideas can also be interchangeable. All ideas are connected.
Space has always been a big interest of mine. My father works for NASA, so as a kid I learned about all kinds of aspects of outer space, such as how temperatures can range from -200 degrees Fahrenheit to 400 degrees Fahrenheit. Doing this internet adventure, I found a lot of material I already knew, but also some interesting facts I had not picked up on before. I started my Wikipedia adventure by searching "Black Holes," and this is what followed:
Black Holes http://en.wikipedia.org/wiki/Black_holes
I started with black holes because I was always curious about what other kinds of black holes there are known in space.
Supermassive Black Holes http://en.wikipedia.org/wiki/Supermassive_black_hole
I found more information on supermassive black holes, which I never knew existed.
Quasar http://en.wikipedia.org/wiki/Quasar
I had no idea what this was. A quasar is a massive and extremely remote celestial object, emitting exceptionally large amounts of energy, and typically having a starlike image in a telescope. It has been suggested that quasars contain massive black holes and may represent a stage in the evolution of some galaxies.
Milky Way http://en.wikipedia.org/wiki/Milky_Way
The Milky Way is also one of the most interesting galaxies in space. Did you know that at the center of nearly every spiral galaxy lies a supermassive black hole?
Galactic Plane http://en.wikipedia.org/wiki/Galactic_plane
I didn’t know there was a term for this!
Barred Spiral Galaxy http://en.wikipedia.org/wiki/Barred_spiral_galaxy
Magellanic Clouds http://en.wikipedia.org/wiki/Magellanic_Clouds
At first I thought these clouds were magical, so I clicked on the link and it led me to the Magellanic Clouds. They are defined as either of the two small galaxies that appear as conspicuous patches of light near the south celestial pole and are companions to the Milky Way galaxy.
Nebula http://en.wikipedia.org/wiki/Nebula
I knew what a nebula was, but I never knew what it looked like. So I clicked on the link, and the pictures that were displayed were absolutely beautiful.
Solar System http://en.wikipedia.org/wiki/Solar_System
Let me throw a revised nugget at you.
I thought my last nugget was pretty interesting, but maybe I left out some detail that I had wanted to add! The nugget I had chosen is as follows:
“Throughout the period I examined, in short, my “thinking” time was devoted mainly to activities that were essentially clerical or mechanical: searching, calculating, plotting, transforming, determining the logical or dynamic consequences of a set of assumptions or hypotheses, preparing the way for a decision or an insight. Moreover, my choices of what to attempt and what not to attempt were determined to an embarrassingly great extent by considerations of clerical feasibility, not intellectual capability.
The main suggestion conveyed by the findings just described is that the operations that fill most of the time allegedly devoted to technical thinking are operations that can be performed more effectively by machines than by men. Severe problems are posed by the fact that these operations have to be performed upon diverse variables and in unforeseen and continually changing sequences. If those problems can be solved in such a way as to create a symbiotic relation between a man and a fast information-retrieval and data-processing machine, however, it seems evident that the cooperative interaction would greatly improve the thinking process.”
When reading this I came to a conclusion: technology will not advance without the perseverance and dedication of mankind to push it into the future. Without mankind, there is no technology. This conclusion has pushed me to believe that one day there will be a huge technological advancement, due to mankind of course, and all will go wrong. This will lead to a takeover by technological beings on Earth, and we will all be compromised. Then again, this is a theory, but who knows, maybe there will be a new version of Frankenstein? So if you're asking whether robots will take over the world, my answer is yes.
Now a lot of people might think differently (and that is totally fine because they might not have watched as many TV shows or movies as I did when I was a kid). But it is always interesting to hear what others have to say about such circumstances.
In What’s for Dinher’s blog, he states a really good point that I never thought about. As humans, we are built to think and to solve situations logically (for the most part, unless we get carried away). The analysis in that blog made me question whether, if we rely on technology so much, we will seriously lose all our thinking capabilities and logical touch. We will end up becoming like the people in WALL-E who use technology so much that they don't care about their bodies.
MJ's blog definitely connects to the first blog I read. With technology rising, we are thinking less than we should. A simple invention like the calculator makes math calculations so much easier for us. Back when my parents were in school, they had to do calculations in their heads and show their work on paper.
In Maryam's blog, she brings up a very interesting point: "how do we talk to computers?" Well, the simple answer is code; the complicated answer is learning to code. There are so many languages out there right now; Java, HTML, and MySQL are only three of the many. Only programmers can communicate with technology and make it operate the way it should. If you're not a programmer, either learn technology, or good luck.
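To make "talking to a computer" concrete, here is a minimal, made-up example in Python: we write an instruction in a language the machine understands, and it carries it out exactly as stated, nothing more and nothing less.

```python
# "Talking" to the computer: a precise instruction it executes literally.
def greet(name):
    """Build a greeting string for the given name."""
    return "Hello, " + name + "!"

message = greet("world")
print(message)  # Hello, world!
```

The computer doesn't guess at intent; if the code says to greet "world", that is exactly what happens, which is why learning to state things precisely is the hard part of coding.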
Max's blog gives another example of how much we rely on the internet today. It's crazy because we rely on it for research, answers to our questions, and educational material. All of these can be found on the world wide web, and we no longer have to go to a public library and search through books for our answers.
In this last blog, I could not agree more with the author. The author says that the way we rely on technology is actually important. A computer computes faster and more efficiently than we humans do. But my question becomes: does that mean we humans have to stop thinking? Absolutely not!
When doing research, I never thought about checking my browser history to see how I go about answering the topic at hand. After I wrote a response about the nugget I chose from Vannevar Bush's essay, I checked my browser history to see what my thinking process was like. My thought process seems to be somewhat vivid, and it seems I need visuals no matter what the circumstance is. When I research, I find that I also like to add some sort of comedic value to any kind of post, whether it's a blunt joke or something satirical.
If I were to compare my thinking process for nugget one with my first blog post, I would continue to say that it is accurate. I think about the past and future and how what I read can affect me personally. It makes me think about how I can be affected, whether negatively or positively, and no matter how negative or positive the situation may be, I add comedic value regardless. Life is too short to be upset, so I like to cheer myself up no matter how things end up.
Life is a mystery, and sometimes I feel like my thinking process is as well. When I'm pondering, it's always a scary moment for me. What goes through my head is a sequence of events, from myself as a child to what my future will potentially be like, or what I want it to be like. It's amazing what memories come back, and I feel really nostalgic. Time flies, and what goes through my mind is thoughts of the past and thoughts of the future. Every time I am thinking, Calvin and Hobbes is the reminder of what or how I should compose myself. Lessons of truth, lessons of happiness: life is too short to feel unhappy.
“I think night time is dark so you can imagine your fears with less distraction” – Calvin