ElasticNinja
Android Expert
remember how you learned to count? on your hands
So will changing from imperial to metric.
we can change, but decimal isn't an awful system, unlike imperial measures
remember how you learned to count? on your hands
So will changing from imperial to metric.
And if everyone's doing it... it must be the best thing to do.
And? A good understanding of a binary based numbering system is absolutely something that EVERY child should have in today's world.
Decimal is outdated, and I honestly don't expect it to be used much in about 150 years.
There is nothing more arbitrary than decimal.
We could go with a binary number system (base 2),
a quaternary number system (base 4, 2 to the second power),
an octal one (base 8, 2 to the third power),
a hexadecimal one (base 16, 2 to the fourth power),
a base-32 numbering system (2 to the 5th power),
or even a base-64 numbering system (2 to the 6th power).
These are the numbering systems required to function in the technological world of the future, and decimal only hinders that.
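As a rough sketch (my own illustration, not from the thread), here is the same number written in each of the bases just listed. The digit alphabet used for bases above 16 is an arbitrary choice for display only; it is not the standard Base64 encoding of bytes.

```python
# Arbitrary display alphabet for digits 0..63 (illustration only).
DIGITS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz+/"

def to_base(n, base):
    """Convert a non-negative integer to a digit string in the given base."""
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(DIGITS[r])
    return "".join(reversed(out))

for base in (2, 4, 8, 16, 32, 64):
    print(f"base {base:>2}: {to_base(2024, base)}")
# base  2: 11111101000
# base  4: 133220
# base  8: 3750
# base 16: 7E8
# base 32: 1V8
# base 64: Ve
```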
In the UK, shots of whisky and other spirits have been standardised for a long time:
Snip
The issue I have with this is that the multipliers used in the metric system are just as arbitrary as those used in the imperial system. 10 millimeters in a centimeter, 100 centimeters in a meter, and 1000 meters in a kilometer.
We presently live in a technological age. As such, why would we use an archaic system like base 10 (decimal) when so much of the world's technology now uses base 16 (hexadecimal)?
The metric system most definitely has its merits... but it's neither the be-all nor the end-all of measurement systems. Either system works for individuals who are educated in its use.
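To put numbers on the multipliers mentioned above, here is a trivial sketch (my own example, not from the thread) comparing metric conversions, which are all powers of ten, with imperial ones, where the multiplier changes at every step:

```python
# Metric: every step is a power of ten, so converting is a decimal-point shift.
mm_per_cm, cm_per_m, m_per_km = 10, 100, 1000
distance_km = 1.5
print(distance_km * m_per_km)                  # 1500.0 metres
print(distance_km * m_per_km * cm_per_m)       # 150000.0 centimetres

# Imperial: the multipliers differ at each step (12 in/ft, 3 ft/yd, 1760 yd/mi).
in_per_ft, ft_per_yd, yd_per_mi = 12, 3, 1760
distance_mi = 1.5
print(distance_mi * yd_per_mi * ft_per_yd * in_per_ft)   # 95040 inches
```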
logic is why
remember how you learned to count? on your hands
we can change, but decimal isn't an awful system, unlike imperial measures
Why, dear friend, do kids of tomorrow need to know quaternary or other systems? Perhaps we should teach them to read and write and use critical thinking before we implode their brains with things they likely do not really need. If there is time left over, fine.
Unless they are going to be professional forum posters, in which case all they really need to learn is how to copy and paste and type the word Wikipedia, the source of all human knowledge.
You said, "A good understanding of a binary based numbering system is absolutely something that EVERY child should have in today's world."
So why do you think that? Programmers, yes, they need to know that stuff. So again, why?
Bob
Don't be a metric Fanboy(Smiley)
Bob
Both are matters of opinion. In this case, yours.
Almost every single profession is heading towards being computerized. Even automobile manufacturing positions now require some level of computer familiarity.
Understanding computers and the way they think isn't just for programmers; it's integrated into just about everything you do.
For instance... how many megabytes are in a gigabyte...
1024.
Why?
It's binary... 2 to the 10th power = 1024.
It's as close as binary comes to a thousand.
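A quick sketch of the arithmetic being used here, assuming the binary convention this post follows (a gigabyte taken as 1024 megabytes; the decimal convention uses 1000, which comes up later in the thread):

```python
print(2 ** 10)       # 1024
print(1024 ** 2)     # 1048576 bytes in a binary megabyte (MiB)
print(1024 ** 3)     # 1073741824 bytes in a binary gigabyte (GiB)
print(1000 ** 3)     # 1000000000 bytes in a decimal gigabyte (GB)
```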
when you can't win, play the fanboy card
Even many programmers, depending on how high-level the programming language is, don't directly deal with binary all that often.
Regardless of how high the level of programming is... we deal with binary. Unless, of course, you never ever deal with files or encoding...
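As one hedged illustration of the kind of byte-level work being described (my own example; the filename is a placeholder), here is a snippet that reads a file's raw bytes and prints each one in decimal, hexadecimal, and binary:

```python
# "example.bin" is a placeholder path for illustration.
with open("example.bin", "rb") as f:
    data = f.read(8)   # first eight bytes

for b in data:
    print(f"dec {b:3d}   hex 0x{b:02X}   bin {b:08b}")
```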
For instance... how many megabytes are in a gigabyte...
1024.
Why?
It's binary... 2 to the 10th power = 1024.
It's as close as binary comes to a thousand.
my point regarding binary etc is that it would give an understanding of how numbers actually work
my point regarding binary etc is that it would give an understanding of how numbers actually work
Could you elaborate on this? Explain, perhaps, how working in binary gives an understanding of how numbers work that decimal does not?
Disputed. A gigabyte is either 1000 or 1024 megabytes. It depends on who is making the claim and which standards are being followed. For example, your hard drive will claim to be one terabyte, and it is 1 000 000 000 000 bytes. If you plug it into a Windows computer, you'll be told that it is ~931 gigabytes.
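A sketch of the discrepancy described above, assuming the usual figures: the drive is sold as 10^12 bytes and the operating system divides by 1024^3:

```python
drive_bytes = 1_000_000_000_000     # "1 TB" as the manufacturer counts it
print(drive_bytes / 1000 ** 3)      # 1000.0 decimal gigabytes (GB)
print(drive_bytes / 1024 ** 3)      # ~931.3 binary gigabytes (GiB)
```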
Could you elaborate on this? Explain, perhaps, how working in binary gives an understanding of how numbers work that decimal does not?
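For what it's worth, one possible reading of the claim (a sketch of my own, not the poster's answer): positional notation works the same way in every base, and writing a number out as explicit digit-times-power terms shows that mechanism rather than memorised column names:

```python
def decompose(n, base):
    """Write n as an explicit sum of digit * base^position (zero digits skipped)."""
    terms, pos = [], 0
    while n:
        n, digit = divmod(n, base)
        if digit:
            terms.append(f"{digit}*{base}^{pos}")
        pos += 1
    return " + ".join(reversed(terms)) or "0"

print(decompose(203, 10))   # 2*10^2 + 3*10^0
print(decompose(203, 2))    # 1*2^7 + 1*2^6 + 1*2^3 + 1*2^1 + 1*2^0
```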
But still people will use it as a source because it's easier to wiki than to do actual research.
Say it with me now... Wikipedia is not a reliable source.
But still people will use it as a source because it's easier to wiki than to do actual research.