Sarbo Roy - CS 101: An Introduction to Computational Thinking
Year: 2018. Genre: Computer.
CS 101
An Introduction to Computational Thinking
Written By Sarbo Roy
Why Should I Read This Book?
There's no question that computers play an integral role in almost every aspect of society. Everyday tasks, such as deciding where to eat and figuring out how to get there, are more efficient with the use of computers. Problems that were once deemed impossible, ranging from sequencing the human genome to remotely diagnosing rare diseases, are now feasible thanks to advances in computational speed and intelligence. Almost everyone uses some sort of computer on a regular basis, but very few actually understand what it is they're using. We are made to believe that only those with a specialized degree or years of complex math under their belt can truly understand how computers work. Even those in the process of securing these prerequisites blindly memorize seemingly archaic concepts, lost as to their practical applicability and dazed by their complexity. You do not need a degree of any type to understand computer science, nor do you need any mathematical skills greater than multiplying and dividing numbers. Not only are you smart enough to understand the fundamentals of computing, you have the ability to invent the basic structure of computer science yourself. That is the goal of this book: not to aimlessly hurl definition after definition in hopes of you understanding the content, but for you to be able to say, "Hey, I could have made that too." You don't have to follow along on your computer or do coding exercises to truly understand this material. This book isn't designed for that type of learning. In fact, some small details might be left out of the code so you can focus on what really matters. Don't worry about what language the code is written in or what software to download so you can run it on your computer. Those elements are important, but they are not needed to understand the fundamentals. All you have to do is read.
Any feedback on this content is always appreciated and can be sent to .
Special thanks to Dawn Labs for making it ridiculously easy to create beautiful looking code.
Chapter 1
Ones And Zeroes
You finally decide that you will no longer be baffled by the seemingly magical ways computers solve complex problems. You want to start this journey of becoming technologically proficient, but where to begin? How do computers even work? Just like every other question, you begin typing half of it in before Google completes your thought for you. You click on the first result and quickly learn that computers operate on only two numbers: one and zero.
At first this notion seems almost impossible. How can everything from simple text to high-definition video to virtual reality all be represented by just two numbers? Even if this concept were intuitive enough to grasp, why specifically two numbers? Surely anything that can be represented by two digits can be represented more efficiently with three or four. After all, our very own number system has ten digits (0-9), and removing eight of them would make our lives anything but easier. Answering these two basic questions will not only give us insight into how computers work, it will also establish a foundation on which every other concept of computer science can be built. Let's begin with the why.
Why do computers only use two numbers?
Let's suppose that we wanted to design our very own computer. Where would we even start? After all, what exactly is a computer? A traditional computer might be seen as a device that displays information to a screen, but at its core, a computer is just a machine that manipulates numbers. Every color shown on a monitor or a laptop screen is a combination of red, green, and blue shown at different levels of intensity. Therefore, if we had a way to represent every shade of those three colors with numbers, we could then reproduce any image or any video (a sequence of images) using just numbers. This concept can also be extended to letters and words. If we assigned every letter in the English alphabet a number, then we could theoretically show everything from simple words to the most complex novels, once again using just numbers. This is important because if we can imagine a device solely focused on representing numbers, and if we had a system in place that could convert those numbers into colors and text when needed, we could then by definition create a computer.
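To make this concrete, here is a small sketch of the idea in Python. The specific codes are illustrative conventions, not something the chapter prescribes: real screens commonly use 0-255 intensity levels per color channel, and Python's built-in ord() and chr() expose one standard letter-to-number assignment (Unicode code points).

```python
# Sketch: colors and text reduced to plain numbers.
# A color is just three intensity levels (red, green, blue).
purple = (128, 0, 128)  # strong red, no green, strong blue

# Letters work the same way: each gets a number, and the mapping
# runs in both directions.
message = "CS"
codes = [ord(ch) for ch in message]            # letters -> numbers
restored = "".join(chr(n) for n in codes)      # numbers -> letters

print(codes)     # [67, 83]
print(restored)  # CS
```

Any image, video, or novel is then just a long list of such numbers, which is all our hypothetical machine needs to handle.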
We've immensely simplified the complex idea of a computer into a machine solely focused on manipulating numbers. But how would such a device even work? We know that every computer operates on electricity, so we would have to design a system that takes electricity as an input and spits out a number as an output. Chances are we don't really understand the core principles behind electricity. The only thing we might know about electric current is that it varies in strength. In fact, this is all we need to know. Maybe if we had a device with nine different switches, with each switch representing a number from 1 to 9, we could represent any digit. The stronger the electric current fed into the device, the more switches would turn on. After all nine switches are turned on, increasing the amount of electricity would no longer create any change in the output. Similarly, if no current is running, none of the switches would be on, and that could represent the number 0. If we had four of these little devices, we could represent numbers as high as 9999, as each device would at maximum represent the number 9. If we put enough of these machines together, we could then represent any number, and therefore any color and any word. Indeed, this is a solution to our problem of representing numbers using electricity, but is it the best possible solution?
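The behavior of this hypothetical nine-switch device can be sketched in a few lines of code. The function name and the "one unit of current per switch" scale are made up for illustration; the point is only that the displayed digit is the count of switches the current manages to flip, capped at nine.

```python
# Sketch of the hypothetical nine-switch device described above.
def nine_switch_digit(current, units_per_switch=1.0):
    """Return the digit shown for a given current strength.

    Each additional `units_per_switch` of current flips one more
    switch on; past nine switches, extra current changes nothing.
    """
    return min(9, int(current // units_per_switch))

print(nine_switch_digit(0.0))    # 0 -- no current, no switches on
print(nine_switch_digit(4.5))    # 4 -- enough current for four switches
print(nine_switch_digit(100.0))  # 9 -- saturated: all switches already on

# Four such devices side by side, one decimal place each, cover 0..9999.
digits = [9, 9, 9, 9]
value = sum(d * 10**i for i, d in enumerate(reversed(digits)))
print(value)  # 9999
```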
The first problem with our solution is sheer efficiency. Let's say it takes N units of electricity to turn on the first switch. If the amounts of electric current needed to activate these switches were direct multiples of each other, it would take 9N units of electricity to flip the last switch. Therefore, it would take nine times more electricity to produce the number 9 in our machine than the number 1. No problem; we'll just make the amount of electricity needed to flip the first switch so small that nine times that amount is still insignificant. However, our device would then have to measure electricity precisely enough to distinguish between these nine levels in order to produce a number.
Unfortunately, combining thousands of these devices together would produce enough interference to make it nearly impossible to measure such values of electricity precisely. Instead of our device showing a 9, it might show an 8 or a 7 due to the interference from the other devices. Think of this problem like trying to understand what a group of people is saying when every member is talking. It might be easy to understand everyone when there are only four or five members in the group, but if a thousand or ten thousand people were talking, it would certainly be impractical to understand what every individual is saying. Our solution to representing numbers isn't at all wrong; it just isn't realistic enough to produce at scale. On top of this, the numbers it presents could be off from the actual values we want. Can we think of a better way?
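This fragility can be demonstrated with a toy quantizer. The function below (names and numbers are illustrative, not from the book) maps an analog current onto a fixed number of levels; the same small amount of noise that corrupts a ten-level reading leaves a two-level, on/off reading untouched, because the two levels sit much farther apart.

```python
# Sketch: fewer levels tolerate more noise.
def read_digit(current, levels, max_current=9.0):
    """Quantize an analog current into the nearest of `levels` values."""
    step = max_current / (levels - 1)
    return min(levels - 1, max(0, round(current / step)))

noise = 0.6  # interference from neighboring devices

# A device distinguishing ten levels (digits 0-9) misreads a noisy 7...
print(read_digit(7 + noise, levels=10))  # 8 -- wrong digit
# ...while an on/off device still reads correctly through the same noise.
print(read_digit(0 + noise, levels=2))   # 0 -- still off
print(read_digit(9 - noise, levels=2))   # 1 -- still on
```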
Maybe instead of nine different switches we could have eight. After all, it would be much easier to measure only eight different levels of electricity instead of nine. But our number system has nine nonzero digits, so how would we represent numbers using only eight switches? We would have to design an entirely different method of counting involving only eight numbers (and 0). For the moment, let us assume that redesigning our system of counting is possible. Why stop at eight switches, then? If we had seven switches, surely we could measure electricity even more accurately than with eight. We can extend this concept further, from seven switches to six to five, all the way down to one switch. With one switch, any level of electricity, no matter how big or small, would turn the switch on. If there is no electricity, the switch would be off. We could represent the off state using zero and the on state using one. This would eliminate our electricity-measuring problem! However, we now have to figure out how to represent every number using just 1s and 0s. Maybe we could have a tally-like system where the number 99 is represented with ninety-nine devices switched on. It would take a lot of devices, but once again it could work. And if we could represent numbers with 1s and 0s more efficiently than with tallies, we would be able to create a computer without having to worry about the accuracy of the numbers themselves. By answering one question, we've created another. Nevertheless, we did answer it, and it turns out we answered it correctly.
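The trade-off this paragraph arrives at can be checked directly. A tally scheme needs one switched-on device per unit, while a positional base-2 scheme, which later chapters will develop as binary, needs only one device per power of two (Python's built-in bin() is used here as a shortcut for that representation).

```python
# Sketch: how many one-switch devices each scheme needs for a number n.
def devices_for_tally(n):
    return n                   # one switched-on device per unit counted

def devices_for_binary(n):
    return len(bin(n)) - 2     # bin(99) == '0b1100011' -> 7 binary digits

print(devices_for_tally(99))     # 99 devices
print(devices_for_binary(99))    # 7 devices (1100011 in base 2)
print(devices_for_binary(9999))  # 14 devices instead of 9999
```

The gap widens fast: binary needs devices proportional to the number of digits, not to the number itself, which is the hint that the question we just created has a good answer.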