Wow. I'm excited!
The emerging idea of putting tiny processors onto memory chips, creating a micronetwork of decentralized microprocessors to which specialized memory microtasks can be delegated, IS TERNARY in structure. I have been waiting for this for a long time. I believe it heralds the beginning of Pattern-based computing systems, which will eventually mature into full ternary hardware and software, and it carries a whole new way to interact with the computer as a learning, teaching, and data-processing device.
First things first:
Since we're still on the subject of Field Programmable Gate Arrays, here is a computer tinkerer's project to consider:
1. All standard ternary logic gates can be emulated by arranging a subset of binary gates. Therefore, an entirely ternary design is possible TODAY using an ordinary off-the-shelf FPGA.
2. Among other things, a ternary chip processes analog-to-digital data more efficiently than a binary chip, and it becomes significantly more efficient with each additional "trit." To start: a lot of toys use 1-bit D/A converters for sound. A one-trit converter has almost twice the quality of its one-bit binary counterpart. Here is more on this idea:
http://www.trinary.cc/Tutorial/Interface/Analog.htm
3. All modern programming languages are essentially binary in structure, relying heavily on binary branching and counting loops such as IF and FOR. Programmers are therefore increasingly thinking in binary thought patterns, a gradual social phenomenon. This has long-term consequences for the way we perceive our own future as a culture, because programmers are designing the computer systems which increasingly shape our interactions with each other.
4. Electricity is itself ternary: A wire conducts in one direction, or the other, or not at all. This is a ternary switch in its most simple form. A native ternary chip is more efficient than binary because all three states of electrical flow are used, instead of only ON/OFF. It is silly to use binary-only algorithms on a ternary chip, just as it is difficult to build a ternary chip out of binary pieces. Both are possible, but waste effort.
5. Artificial intelligence algorithms and some sophisticated search algorithms often exploit this efficient ternary design, but none are doing it at the hardware level, AFAIK.
6. Some of the first computers built (notably the Soviet Setun, in 1958) were ternary computers, and worked very well. When mass-produced silicon took over from the early switching technologies, designers opted for binary, and we've been stuck with it ever since.
7. Ternary processing is more like the way your own brain processes information. That is why AI keeps exploring ternary algorithms. And I believe THIS is the most important reason to experiment with this project.
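To make point 1 concrete, here is a minimal Python sketch (the names and encoding are my own illustration, not a real FPGA design) of the three basic balanced-ternary gates. On an FPGA, each trit would be carried on two binary signals, and each of these functions reduces to a small binary lookup table built from ordinary gates.

```python
# A minimal sketch: each trit takes the balanced values -1, 0, or +1.
# The three basic ternary gates (inverter, MIN, MAX) are the analogues
# of NOT, AND, OR. On an FPGA a trit would be held in two binary
# signals, and each function below becomes a small binary LUT.

TRITS = (-1, 0, 1)

def t_not(a):
    """Ternary inverter: flip the sign of the trit."""
    return -a

def t_min(a, b):
    """Ternary AND-analogue: the smaller of the two trits."""
    return a if a < b else b

def t_max(a, b):
    """Ternary OR-analogue: the larger of the two trits."""
    return a if a > b else b

# Exhaustive truth table over all nine input pairs
for a in TRITS:
    for b in TRITS:
        print(a, b, t_min(a, b), t_max(a, b), t_not(a))
```

Nine rows cover every input pair, which is why emulating these on binary hardware is straightforward: each gate is just a 9-entry table.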
Now for the project: Build a ternary CPU on an FPGA, and use the inherent affinity for analog processing to perform some advanced experiments in generating truly random numbers. I believe it is possible to generate a continual stream of purely random numbers with a ternary design. There are some good uses for such a stream, including truly secure encryption.
The question: WHY NOT DO THIS IN BINARY?
Well, you _can_, but you need to generate entropy with an analog interface (like moving your mouse around, etc.). So I respond: why not do it in ternary?
Here is a second justification: Donald Knuth, in The Art of Computer Programming, writes "Balanced ternary notation is the most beautiful of all." Some calculations which are quite difficult in binary, requiring several steps of data movement, are incredibly elegant in ternary, sometimes requiring nothing more than a simple inversion of the "trits."
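The "simple inversion of the trits" is easy to demonstrate. Here is a short Python sketch (function names are mine, for illustration) showing that negating a balanced-ternary number is just flipping every trit:

```python
# Balanced ternary uses digits -1, 0, +1 (here a list of ints, most
# significant trit first). Negating a number is just flipping every
# trit -- no borrow, no two's-complement dance.

def to_balanced_ternary(n):
    """Convert an integer to balanced-ternary trits (-1, 0, +1)."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:          # digit 2 becomes -1 with a carry
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits[::-1]

def from_balanced_ternary(trits):
    value = 0
    for t in trits:
        value = value * 3 + t
    return value

x = to_balanced_ternary(42)        # [1, -1, -1, -1, 0]
neg = [-t for t in x]              # negation: invert every trit
print(from_balanced_ternary(neg))  # -42
```

Compare that one-line negation with binary two's complement, where you must invert every bit and then add one, propagating carries.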
There is a third justification, which goes like this: because algorithms MUST branch TRUE or FALSE, simply as an extension of the physical capability of the underlying chip, all programmers naturally think and weigh their options from within a binary thinking process. This is an artificial and inefficient restriction of our natural ternary mind (which, being electrical, utilizes all three states of the switch described earlier).
I personally want to demonstrate that a significant number of difficult processing challenges are more easily managed by a programmer who "thinks" in the native mode of electrical flow, instead of thinking in a limited subset of that same flow.
I also want to demonstrate that a PATTERN-based access to data is a thousandfold more efficient than the TREE-based access which all binary systems utilize today. Tree-based access is necessary because binary systems cannot "leap" into undefined space. Everything must be carefully linked to what has gone before, in the shape of a tree. Pattern-based access can utilize the same switches for multiple layers of information, and can thus "leap" into undefined space without corrupting what is already there.
This idea is nearly inconceivable from within binary thought processes, because it appears to be a form of DIVIDE-BY-ZEROing data into place (What, you mean you want the data to be in a continual state of crashing my computer?). In fact, it is much more efficient, because layers and layers of data can be built up with the same switches, as long as the switches can be triggered by a time-based pattern event which originates with a particular "seed" shape. Thus the "tree" still exists, but moves into a third dimension, not just laterally like ordinary binary databases and filesystems.
This is the actual reason I want to work on this project. I think that a whole new way of thinking about computers and the structure of our own mind is buried in the fact that some of the first computers ever designed were ternary in structure.
Now, if that did not convince you, here is one of the most interesting things you have ever seen, in the way of computers. It is one of the oldest mechanical computers ever built, and, it was ternary.
Oh, dear. It's easy to see I'm excited because I accidentally sent the previous e-mail while still editing it. Well, I'll finish it by adding the missing URL at the end of the email:
Now, if that did not convince you, here is one of the most interesting things you have ever seen, in the way of computers. It is one of the oldest mechanical computers ever built, and, it was ternary.
http://www.mortati.com/glusker/fowler/
This is truly one of the most intriguing things I have come upon in my life. Take a moment to consider this computer, designed in the 1830s.
Yes, 1 8 3 0 s.
It is possible to emulate a whole bunch of these in an FPGA. Anyone want to dig into this FPGA project? It is entirely over my head, but with a handful of us it might get interesting.
Here is a starting point:
http://www.trinary.cc/
http://web.archive.org/web/20070315212305/http://www.niisi.ru/old/pap_for.ht...
(This is to address Oren's concern that we don't _do_ much 'round here, for I am 100% certain that it will be seen in the future as one very interesting thing that KCLUG did, back in the day. Even just talking about it is enough.)
The fork of Linux which would run on this system? We could call it Trinux? It would actually not be a fork, but I hadda think of some way to bring this around to on-topic...
Although Linux would be able to run on it, now that I come to think about it. Hmmm... That's a ways off...
-Jared
Trinux has already been used for a floppy-based security/rescue distro.
-----Original Message-----
From: Jared
Sent: Thursday, August 09, 2007 6:32 PM
<snip>
The fork of Linux which would run on this system? We could call it Trinux? It would actually not be a fork, but I hadda think of some way to bring this around to on-topic...
Although Linux would be able to run on it, now that I come to think about it. Hmmm... That's a ways off...
-Jared
On 8/9/07, Jared [email protected] wrote:
Wow. I'm excited!
2. Among other things, a ternary chip processes Analog-to-Digital data more efficiently than a binary chip. It becomes significantly more efficient with each "trit." To start: a lot of toys use 1 bit D/A converters for sound. A one bit trinary converter has almost twice the quality of a binary system. Here is more on this idea:
I like balanced trinary too. But you've got to sell it better:
Binary digIT = BIT
Trinary digIT = TIT
This will help get people interested in your project, and they may even subscribe to your newsletter.
I like balanced trinary too. But you've got to sell it better:
Binary digIT = BIT
Trinary digIT = TIT
This will help get people interested in your project, and they may even subscribe to your newsletter.
<smile>
Yes, this is a fruitful suggestion for the short term.
However, there being no such newsletter, such people would soon discover that there is nothing which interests them in the rather geeky project which has no nipples, and they would move on to other breasts. Better to stay with the industry-standard term for the smallest particle of ternary calculation: "trit." This ensures that people who participate are serious about what they're doing.
Nevertheless, you raise a good point. So here is something that WILL get people interested, once they get over the "No way, that's impossible" barrier:
Remember the fictional little black box in the 1992 Robert Redford movie "Sneakers" which could decrypt any encrypted system? Remember SETEC ASTRONOMY and TOO MANY SECRETS? Remember how goofy they made it look on the screen (i.e. instantly decrypting graphical data in real time with a Beta version of the code), but nevertheless such a black box is theoretically possible?
It is indeed possible. After nine years of working on this as a theory, I can finally prove that it is possible.
Listen carefully now, because I'm talking about a complete paradigm shift, so it is normal to think "that's impossible," though admittedly that is a limiting place to start.
The software running inside that "impossible" black box is ternary, and uses pattern-based analysis to decrypt. Instead of decrypting a small string by using hundreds of thousands of password strings, each varying by a single character until you stumble upon the correct one, you decrypt by studying the layers of _patterns_ generated by any encryption process. Once you open the first layer, the others become successively easier, because each layer gives clues to the next.
In other words, you do not need a key. In the new paradigm, "The data is the key," no matter how encrypted it appears to be. If you really need a password, you could generate the key by looking through the layers of patterns which exist in encrypted data until you find one which is _meaningful_. And then you begin to add layers of meaning until you have the entire dataset, unencrypted. From there, it's fairly trivial to generate a key.
But what about that little meaningful seed, the first layer? Won't it be hard to find in the great big pattern of encrypted data?
That is a useful question if you're used to looking for needles in haystacks: Stop looking for the needle; look at the hay. Any one of _thousands_ of seeds will be available, so it's not difficult. This is much more elegant than brute-force decryption. You're able to open up the data directly, without using a key. In other words, the key is entirely useless, except maybe as a historical timestamp.
Now here's the part that will catch your attention:
The only way to prevent such pattern-based analysis from cracking open any binarily-encrypted clump of data, or even a packet stream, is to generate TRULY RANDOM keys, which can only be created ... you guessed it ... within a ternary pattern-based algorithm. You already know that binary algorithms can NEVER create true randomness, but can only emulate it with ever finer precision. Like approaching infinity: you are always getting closer, but never quite arrive.
Fortunately, the most basic supporting evidence for this kind of pattern analysis is already in the public domain, but if you are skeptical, you would not believe it even if it were put right in front of you. So be at least curious, as a good starting point. Note that like everything else based in binary logic, the supporting evidence is rudimentary, only two-dimensional compared to the mature version, which is 3-D. If you are curious, here's an encrypted hint: "Babble code is for babies."
(I'm intentionally giving away enough information so that anyone who wants to can build such a system on their own. I am poor and do not have the hardware resources to begin building the system, and it seems silly to let this be a reason to impede research.)
:-)
Nevertheless, it is obvious that this is not the place or time to go into full details. Rather than give away a secret which drives millions of dollars of R&D every year at the highest levels of all high-security organizations... I think I'll just wait a little bit, until someone who is looking stumbles upon these posts via a Google search someday and decides that there really is something useful in the ternary angle. Such a person probably already knows there is something here, and is merely researching a little prior art, and will be delighted to see this letter.
Or perhaps, I should say, I think I'll just wait a little trit.
For that's when the very interesting conversation will begin. Until then, I'm content with tit versus trit, and happy that this conversation is now archived where all can see. And only slightly off topic.
Of course, there is also the possibility that someone on this list may be intrigued and want to start experimenting with FPGAs...
-Jared
Who else has read "The Muller-Fokker Effect" by John Sladek? http://www.amazon.com/gp/product/0881845485/?tag=tipjartransactio
I bring that up because it is a very entertaining book about a data storage medium -- funny magnetic tape; the book was first published in 1970 -- with a fine sense of humor. Use Muller-Fokker equipment and your random pen plots become witty fractal cartoons that out-Monet Monet and out-Goldberg Goldberg at the same time.
On 8/11/07, Jared [email protected] wrote:
TRULY RANDOM
What about overloading a diode? Or timing radioactive decay? There was a firm in the late nineties that got millions of venture dollars for a random number generator that involved three video cameras pointed at lava lamps, for instance. Anyone who knew that you can get perfectly good white noise anywhere, anytime, found that disturbing.
Anyone for overloading components to make a hardware /dev/random card? Is there a market for it? I doubt there is. There are plenty of organic inputs available to re-key your generators with; that's why OpenSSL times keystrokes while generating keys, for example.
The link to the page on the model of the base-3 nineteenth-century adding machine was certainly interesting. I saw a bit on television once about a group that had built a model of a clockwork table that da Vinci had designed, which could be programmed to tote stuff from room to room, I think by placing pegs in a wheel which turned much more slowly than the drive wheels, to steer. (The steering was done by the pegs, or absence of pegs, IIRC.)
It's fun to imagine that several were built, and Medicis of all stripes oohed and aahed as snacks appeared carried not by a live servant but on a moving table.
David Nicol wrote:
Who else has read "The Muller-Fokker Effect" by John Sladek? http://www.amazon.com/gp/product/0881845485/?tag=tipjartransactio
I bring that up because it is a very entertaining book about a data storage medium -- the book was first published in 1970, it's funny magnetic tape -- with a fine sense of humor. Use Muller-Fokker equipment and your random pen plots become witty fractal cartoons that out-Monet Monet and out-Goldberg Goldberg at the same time.
Ah, yes! This is the kind of direction to think. Creative! Was there any way to control the Muller-Fokker equipment, or did it out-Goldberg Goldberg one day, and out-Picasso Picasso the next? I am guessing that the phrase "funny magnetic tape" means the book is ancient, from the very beginning of the Unix epoch, when magnetic tape was used for storage. And humorous.
TRULY RANDOM
What about overloading a diode? Or timing radioactive decay? There was a firm in the late nineties that got millions of venture dollars for a random number generator that involved three video cameras pointed at lava lamps, for instance. Anyone who knew that you can get perfectly good white noise anywhere, anytime, found that disturbing.
Exactly. Such wild speculation caused the dotcom crash. :-) I suppose the emperor soon got his clothes on with that one.
However, you make a really good point: White noise is often an _effect_ captured in the digital world which is _caused_ in the real world.
Notice that in all three of your examples to achieve true randomness, you are utilizing an analog-to-digital conversion. (i.e. you are capturing a random pattern occurring in the Real World with digital annotation). Note also that ternary logic handles analog-to-digital conversion much more efficiently than binary. This is empirically true, and demonstrated mathematically here:
http://www.trinary.cc/Tutorial/Interface/Analog.htm
So the question is now refined to greater accuracy, but remains:
Is it possible to create a random number generator that has absolutely no interface with the analog, organic, real world? In other words, is entirely digital in origin?
Anyone for overloading components to make a hardware /dev/random card? Is there a market for it? I doubt there is. There are plenty of available organic inputs about to re-key your generators with; That's why openSSL times keystrokes while generating keys, for example.
Yes. Or mouse movements, or any other "analog" movement. And for contemporary uses of random numbers, these organic inputs are sufficient to create superduper truly random numbers. However, ever notice how the image CAPTCHAs keep getting more sophisticated, because the software which cracks them does too? It is the same with all encryption: it works today, but tomorrow it is useless.
Within ten years, we will have computers which can crack open the random seeds you described with brute force alone, and there will be other techniques developed by then, as well. We already saw that the 700 GHz light-based processor is just around the corner, and I imagine we'll see 1 THz by 2012...
Today's high-security encryption is tomorrow's child's toy.
What I propose is not useful today, but will be perfect timing when it appears on the market in about one decade, when people are starting to look seriously for a way to generate true randomness without cumbersome analog interfaces. Or at least with an elegant way to interface with analog phenomena.
The link to the page on the model of the base-3 nineteenth century adding machine was certainly interesting. I saw a bit on television once about a group that had built a model of a clockwork table that DaVinci had designed, which could be programmed to tote stuff from room to room, i think by placing pegs in a wheel which turned much more slowly than the drive wheels, to steer. (the steering was done by the pegs, or absence of pegs, IIRC.)
It's fun to imagine that several were built, and Medicis of all stripes oohed and aahed as snacks appeared carried not by a live servant but on a moving table.
So Leonardo da Vinci created the first robot!
I read recently that Tesla designed a turbine which was necessarily so large that no one could afford the first prototype, until a large oil company realized that it was just what they needed. The first one was manufactured in the 1990s, and is used on big oil rigs out in the ocean. Tesla still gets the credit, even though he had no working prototype in his lifetime.
So I'm content with simply putting this idea out there, even if I never get the chance to actually build it physically. The point is to get it into the hands of the ordinary people before the military gets ahold of it... of course ... :-)
-Jared
On 8/12/07, Jared [email protected] wrote:
David Nicol wrote:
Notice that in all three of your examples to achieve true randomness, you are utilizing an analog-to-digital conversion. (i.e. you are capturing a random pattern occurring in the Real World with digital annotation). Note also that ternary logic handles analog-to-digital conversion much more efficiently than binary. This is empirically true, and demonstrated mathematically here:
I hate to get involved in what looks like it could become a perfectly good flame war, but I looked at your link.
By the same logic, we would be much better off using a decimal computer. It takes 5 trits to write 143, but I could write 999 in just 3 decimal digits (dits?)
I don't think you're going to get a lot of argument that the higher the base the fewer digits it takes to represent a number. That does not, however, make it a more efficient design for anything other than printing. I frequently write values in hex when programming or documenting things for the same reason.
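A quick sanity check of the digit counts, in Python (a throwaway illustration, not anyone's production code): 143 takes 8 bits, 5 trits, and 3 decimal digits, which is exactly the shortening-with-radix effect described above, and says nothing by itself about hardware efficiency.

```python
# Digit counts for the same value in several bases. This only shows
# that a higher radix shortens the written form (143 needs 8 bits,
# 5 trits, 3 decimal digits) -- not that it is better hardware.

def digits_needed(n, base):
    """Number of digits needed to write n >= 0 in the given base."""
    count = 0
    while n:
        n //= base
        count += 1
    return max(count, 1)

for base in (2, 3, 10, 16):
    print(base, digits_needed(143, base))
```

The same shortening is why programmers write values in hex: fewer digits on the page, identical bits underneath.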
Eric Johnson wrote:
Notice that in all three of your examples to achieve true randomness, you are utilizing an analog-to-digital conversion. (i.e. you are capturing a random pattern occurring in the Real World with digital annotation). Note also that ternary logic handles analog-to-digital conversion much more efficiently than binary. This is empirically true, and demonstrated mathematically here:
I hate to get involved in what looks like it could become a perfectly good flame war, but I looked at your link.
By the same logic, we would be much better off using a decimal computer. It takes 5 trits to write 143, but I could write 999 in just 3 decimal digits (dits?)
Eric, you raise a good point.
This is a keen intuitive leap, but it turns out that it is not the reason that ternary is better than binary. It is actually because electricity has _exactly_ 3 states:
+1  current flowing one way
 0  no current
-1  current flowing the other way
Because of this, ternary is optimal, because electricity is itself ternary. If you try to build quaternary or greater gates, you create WAY more complexity than you need to. To build quaternary gates, you actually create ternary plus unary. And then to build "quintinary" gates, you build ternary plus binary. And so forth. Decimal would be a real mess.
Electricity itself is ternary. That's why ternary gates are the most efficient. Binary conversion is 'clipping' one third of the three-part A/D conversion, whereas ternary is keeping that third.
I don't think you're going to get a lot of argument that the higher the base the fewer digits it takes to represent a number. That does not, however, make it a more efficient design for anything other than printing. I frequently write values in hex when programming or documenting things for the same reason.
You are correct. The argument is not towards "higher base" but rather it is towards:
"a base which accurately expresses the natural capacity of electrical flow."
This should be sufficient answer to the observation you made.
Part II: How to keep this from becoming a flame war.
As for flame war, you have just introduced the most interesting real-time proof of the efficiency of ternary logic. And for this reason, I am going to end these conversations, because the point is entirely made. Here goes:
The concept of "War" is itself a binary concept, being perfectly opposed to "Peace." In binary conversation, you are either in one state or the other. In ternary, there is another option. Let us call this one "Abeyance," which is an old term meaning something like "undecided." Or perhaps "learning."
Abeyance happens to be a perfectly useful state which is neither war nor peace. Here is how it works: If you will go back through the seven posts I have written in this conversation, you will see something interesting happening, which does not always happen in online conversations.
At every juncture where someone found reason to "disagree," I promptly answered: "You are correct," and went on to show how the disagreement was not a complete rejection of the theory, but only a slight disagreement, and more an observation in favor of it.
This is how ternary operates. The "middle ground" which is normally excluded from conversation because a person is either RIGHT or WRONG, is actually the most important part of conversation. It is where a person is in a state of flux, being part way between one or the other binary poles.
Abeyance.
Thus, it is _impossible_ to get into a flame war with ternary logic, because at every juncture, the ternary thinker says "Wow. You are absolutely correct." How can you be at war with someone who is incrementally agreeing with you at every stage of the conversation? Some people say "you're tricking me!" But in fact, this is not an outward manipulation, this is actually what is happening. Look back at the conversation and you will see.
And that, being as real as it gets, is sufficient to introduce the beauty of ternary logic which Donald Knuth referred to when he said:
"Balanced ternary is the most beautiful numbering system in math."
He wrote this in The Art of Computer Programming many years ago. And it is still true. Now who's gonna argue with Donald Knuth?
As the ensuing conversation, in which it is impossible to have a flame war, could take a long time, and yet be friendly all the way, I now respectfully request this conversation go off-list so we can learn more about the 710 MHz processor and other such eastward flying falcons.
G'day.
-Jared
Knuth has been brought up, so the thread must be over.
I have to add that I regularly use perl's base-26 incrementation for creating file names. Try it yourself:
perl -wle 'BEGIN{$L ="a"} print $L++ for 1..100'
On 8/12/07, Jared [email protected] wrote:
This is a keen intuitive leap, but it turns out that it is not the reason that ternary is better than binary. It is actually because electricity has _exactly_ 3 states:
+1  current flowing one way
 0  no current
-1  current flowing the other way
No. To deal with analog to digital, electricity has a near infinite number of states. I would prefer a system that can measure 10 voltage levels than one that can only measure directions and no current. I can easily set up the circuit so that measurements can be taken in each direction.
Kclug mailing list [email protected] http://kclug.org/mailman/listinfo/kclug
Eric Johnson wrote:
On 8/12/07, Jared [email protected] wrote:
This is a keen intuitive leap, but it turns out that it is not the reason that ternary is better than binary. It is actually because electricity has _exactly_ 3 states:
+1  current flowing one way
 0  no current
-1  current flowing the other way
No. To deal with analog to digital, electricity has a near infinite number of states. I would prefer a system that can measure 10 voltage levels than one that can only measure directions and no current. I can easily set up the circuit so that measurements can be taken in each direction.
Actually, yes. Your facts are correct, but you are using them to justify a quantity-based measurement, and ternary is quality-based. You are looking at the point-of-measuring, and I am looking at the mathematical _conversion_ between that analog measuring point and the digital storage.
This whole Quality debate was well exercised in Zen and the Art of Motorcycle Maintenance, so we don't need to go into all the details here.
You are correct that analog current has a near infinite number of states, continually flowing. You are thinking of the '3' as meaning '3 measuring points.' This is not what is meant by ternary A/D conversion. What we're talking about is how to quickly and efficiently move the finest grains of data possible, and to do that we are converting from fractions to discrete numbers.
I quote, again from the trinary.cc website, which I believe has been published out of England since at least the late 1990s. It represents the work of an engineer whose statements are all mathematically provable. There are other websites which discuss these things, but his is the clearest for our purposes. Look closely at what he is saying:
<extended quote> Fractions are represented as a base number raised to a negative exponent. They are commonly a source of error in scientific calculations. Fractions are inherently analog in nature. Computers are discrete. This means that computers approximate fractions as closely as possible - but can be off by as much as half the value of the smallest digit.
For example, let's see how 0.6 would be represented using only 3 digits. We get the following results:
Base 2: 101 = .500 + .000 + .125 = .625
The margin of error is: .625 - .600 = .025

Base 3: 121 = .333 + .222 + .037 = .592
The margin of error is: .600 - .592 = .008
As you can see, base 3 is much more accurate.
The use of fractions becomes important whenever floating point numbers are used. Floating point numbers use scientific notation as the basis of representing a number. For example, the IEEE standard for long double precision floating point is 80 bits in size: 64 bits for the number, 15 bits for the exponent, and 1 for the sign. A typical PC is only 32 bits wide. This number takes 3 memory cycles to access. To get roughly the same precision with trinary digits, it would take: 40 for the number, 9 for the exponent, and 1 for the sign. For a 27 trit trinary computer, this would only take 2 memory accesses to read the number.
However, if using a trinary computer, we would have 4 trits left over (2 memory accesses is 54 trits; 54 - 40 - 9 - 1 = 4 trits unused) that could be re-assigned so that they are meaningful (e.g., 42 for the number, 11 for the exponent, and 1 for the sign). This would surpass the capabilities of IEEE long double precision floating point calculations by roughly 8 times in both precision and magnitude.
Why 3 and not 4, 5, 6... ?
The trinary math system utilizes the 3 natural states of electrical current flow. A wire conducts in one direction, or the other, or not at all. Base 4 would need to have 4 states, which don't naturally exist. A designer would need to use discrete voltage levels to make it work. This leads to noise margin problems and increased power consumption because the transistors will need to be in the active state. If the designer tried to quantize the numbers for mathematical operators, he would have to build 4 window detectors to signal when a voltage represents a specific number. Just detecting the individual numbers makes anything above base 3 unwieldy. </extended quote>
http://www.trinary.cc/Tutorial/Introduction/Intro2.htm
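The quoted 0.6 example is easy to check numerically. A short sketch (the helper name is mine) that rounds 0.6 to the nearest 3-digit fraction in each base:

```python
def best_fixed_point(x, base, ndigits):
    """Round x to the nearest fraction expressible in ndigits digits of the base."""
    scale = base ** ndigits
    return round(x * scale) / scale

err2 = abs(best_fixed_point(0.6, 2, 3) - 0.6)  # 3 bits:  .101 = 5/8   = .625
err3 = abs(best_fixed_point(0.6, 3, 3) - 0.6)  # 3 trits: .121 = 16/27 ~ .593
```

This reproduces the quoted errors (.025 versus roughly .007), though note it compares an equal number of digits, not an equal amount of information.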
The sum is that .333 is more useful mathematically than .5 when creating fine-grained fractions. It is really, as Knuth said, a very beautiful way of doing math, much more elegant than binary or decimal. Not only mathematically, but ON THE CHIP it is physically easier to manipulate the data in certain calculations -- sometimes by simply inverting the trits, you perform a complex calculation which requires several steps in binary.
I really am not coming from a position of opinion. Rhetoric yes, but what I am presenting are facts and a new way of looking at them which is quite amazing. For example, the trinary.cc guy is unaware of the implications of ternary logic on random number generation which I have brought into this conversation.
-Jared
p.s. for anyone still reading: the idea that someone is WRONG is itself a binary assumption which virtually disappears when you start thinking in ternary. Instead of saying someone is WRONG, you simply say "Oh, he hasn't yet completed his journey on that subject..." And then you have a moral imperative to help him learn, instead of a moral imperative to "correct" him. Unless of course, he really is wrong, which is exceedingly rare, like one in a billion or so.
So here is the summary so far:
1. A TRUE RANDOM NUMBER GENERATOR AND ENCRYPTION OPENER.
2. BETTER QUALITY DIGITAL TO ANALOG INTERFACES.
3. WORLD PEACE.
What's next, the Grand Unifying Theory itself? The cure for cancer?
One researcher in Central Peru is working with the Aymara language, the only language in the world which is natively ternary in structure. Roughly 2 million people speak it, and his project is to build a SuperTranslator which accurately translates meaning between ALL languages, by translating them in and out of Aymaran, which is a "perfect language." (This sounds weird, but linguists immediately recognize what I'm saying here, so I'll just say: it's possible).
Suffice it to say: there's something really amazing about ternary logic which is not yet fully understood, and which changes everything it touches, for the better.
Sigh. :-)
Honest, I'll quit responding on this subject if you will. I'm done. It is successfully archived where surfers at archive.org will be able to find it well into the future.
But if you write again and say I'm wrong, I'll write back and show you that YOU ARE CORRECT and, interestingly, I am also.
G'night.
On Monday 13 August 2007, Jared wrote:
For example, lets see how 0.6 would be represented using only 3 digits. Based on the following table, we get the following results:
Base 2: 101 = .500 + .000 + .125 = .625 The margin of error is: .625 - .600 = .025
Base 3: 121 = .333 + .222 + .037 = .592 The margin of error is: .600 - .592 = .008
As you can see, base 3 is much more accurate.
You're not making a valid comparison. 3 digits of base 2 only has 8 possible combinations while 3 digits of base 3 gives 27 (237% more!). To be fair, you would need to compare 8 digits of base 2 with 5 digits of base 3, and then base 2 would have only a slight advantage of 5% more possibilities.
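The state-count arithmetic in this rebuttal checks out; a quick sketch (variable names are mine):

```python
# Three digits in each base:
assert 2 ** 3 == 8 and 3 ** 3 == 27          # 27 is 237.5% more than 8

# A fairer fight at nearly equal information content:
bits8 = 2 ** 8    # 256 states in 8 bits
trits5 = 3 ** 5   # 243 states in 5 trits
advantage_pct = (bits8 - trits5) / trits5 * 100   # ~5.3% more states for binary
```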
The trinary math system utilizes the 3 natural states of electrical current flow. A wire conducts in one direction, or the other, or not at all. Base 4 would need to have 4 states, which don't naturally exist. A designer would need to use discrete voltage levels to make it work. This leads to noise margin problems and increased power consumption because the transistors will need to be in the active state. If the designer tried to quantize the numbers for mathematical operators, he would have to build 4 window detectors to signal when a voltage represents a specific number. Just detecting the individual numbers makes anything above base 3 unwieldy.
That's an efficiency and implementation problem and nothing more.
p.s. for anyone still reading: the idea that someone is WRONG is itself a binary assumption which virtually disappears when you start thinking in ternary. Instead of saying someone is WRONG, you simply say "Oh, he hasn't yet completed his journey on that subject..." And then you have a moral imperative to help him learn, instead of a moral imperative to "correct" him. Unless of course, he really is wrong, which is exceedingly rare, like one in a billion or so.
This sounds exceedingly relativistic. Objective statements are either right or wrong. If someone believes a falsehood through ignorance, they may not be at fault, but the fact is they're still wrong.
DaVinci made at least three (that I recall from a book or program) clockwork entertainment devices for the king or ruling person. They were run by spring-stored energy like a wind-up toy. They would roll out a certain distance, play music or do something entertaining and then return to point of origin. Not really a robot, more automaton or wind-up mannequin. The gears and springs and some drawings have been figured out to prove this apparently. DaVinci built this stuff to wow the ruler so they would keep paying him regularly to make cool stuff, in a nutshell. Many of his designs were never built or did not withstand time and weather.
I imagine the Medicis would think it cool if snacks came out on a moving table, so would many people today. I don't doubt that he came up with many more things we will never know about.
-----Original Message----- From: David Nicol Sent: Sunday, August 12, 2007 1:07 AM
<snip>
The link to the page on the model of the base-3 nineteenth century adding machine was certainly interesting. I saw a bit on television once about a group that had built a model of a clockwork table that DaVinci had designed, which could be programmed to tote stuff from room to room, i think by placing pegs in a wheel which turned much more slowly than the drive wheels, to steer. (the steering was done by the pegs, or absence of pegs, IIRC.)
It's fun to imagine that several were built, and Medicis of all stripes oohed and aahed as snacks appeared carried not by a live servant but on a moving table.
On 8/11/07, Jared [email protected] wrote:
breasts. Better to stay with the industry-standard term for the smallest particle of ternary calculation: "trit."
I didn't know there was an industry. I thought the whole point of the discussion was that the industry was strictly binary.
So here is something that WILL get people interested, once they get over the "No way, that's impossible" barrier:
. . .
The software running inside that "impossible" black box is ternary, and uses pattern-based analysis to decrypt. Instead of decrypting a small string by using hundreds of thousands of password strings, each varying by a single character until you stumble upon the correct one, you decrypt by studying the layers of _patterns_ generated by any encryption process. Once you open the first layer, the others become successively easier, because each layer gives clues to the next.
Anyone using a password like this isn't really protecting their data from an attacker with crypto knowledge (or access to common cracking tools).
The only way to prevent such pattern-based analysis from cracking open any binarily-encrypted clump of data, or even a packet stream, is to generate TRULY RANDOM keys, which can only be created ... you guessed it ... within a ternary pattern-based algorithm. You already know that binary algorithms can NEVER create true randomness, but can only emulate it with ever finer precision. Like approaching infinity; you are always approaching it, but never quite get there.
I must call bullshit on this. Ternary computation does not possess any magical properties. A number is a number, and the same algorithm written in base 2, 3, 8, or 10 can also be done in 16, 256, etc. The reason why computers normally can't do randomness is that they aren't designed for it. They take inputs and perform calculations in such a way that the same inputs always generate the same outputs. They are specifically designed to do things like error correction to prevent producing different outputs from the same inputs.
If you want to add randomness to computers, you need to design circuits to produce random data. Start with something like a Geiger counter. Every time it records a hit, take the time of the event (with a clock that can do millisecond or finer resolution), subtract the previous event time, and throw away all but the last few bits or trits of the delta. Take as additional events every valid SYN-ACK packet received by the TCP stack, keypresses that are at least 3 seconds after the previous keypress (you don't want to use the internal timings of typing, because a person might have a rhythm that introduces a bias into the data), and the pixels of webcams aimed at busy intersections. The idea is to use a lot of different sources for the events, so that an attacker with access to one source will be unable to know what other events intervened between those of which he is aware.
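The multi-source mixing scheme described above can be sketched in a few lines. This is only an illustration of the idea (SHA-256 as the mixer and the 8-bit mask are my choices, not anything specified in the thread), not a vetted RNG design:

```python
import hashlib

class EntropyPool:
    """Accumulate low-order bits of inter-event time deltas from many sources."""

    def __init__(self):
        self._pool = hashlib.sha256()
        self._last = {}          # last timestamp seen, per source

    def add_event(self, source, timestamp_ns):
        prev = self._last.get(source)
        self._last[source] = timestamp_ns
        if prev is None:
            return
        delta = timestamp_ns - prev
        low_bits = delta & 0xFF  # keep only the last few bits of the delta
        self._pool.update(bytes([low_bits]))

    def get_bytes(self, n):
        # Stretch the pool state into n output bytes. A real design would
        # track entropy estimates and reseed; this sketch does not.
        out = b""
        counter = 0
        while len(out) < n:
            h = self._pool.copy()
            h.update(counter.to_bytes(4, "big"))
            out += h.digest()
            counter += 1
        return out[:n]
```

Events from a Geiger counter, the TCP stack, the keyboard, or a webcam would each feed add_event under their own source tag, so an attacker who sees one source still cannot reconstruct the pool.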
If you are curious, here's an encrypted hint: "Babble code is for babies."
Here's my encrypted answer: "The Falcon flies east in the evening."
Monty J. Harder wrote:
On 8/11/07, *Jared* <[email protected] mailto:[email protected]> wrote:
breasts. Better to stay with the industry-standard term for the smallest particle of ternary calculation: "trit."
I didn't know there was an industry. I thought the whole point of the discussion was that the industry was strictly binary.
Yes: the term "trit" goes back to the 1960s at least. There actually were some ternary computers built in Russia in the 1960s, and some experiments were done here as well. Only in the past few years has there been a resurgence of interest in this interesting "industry." So, yes, there is an industry, and one of its internal debates is whether to use the word "trinary" or "ternary," with the latter being more technically accurate, but the former being more popular.
The whole matter is easily confused with other, specialized uses of the term "ternary," with the famous example being the most elegant way to store data for quick retrieval: use a ternary search tree. Any C programmer probably learned about this in his intro classes. Fortunately, this example is not too distant from what we're talking about here, so it all works out in the end. The point is, ternary structures ARE REALLY INTERESTING.
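For readers who haven't met it, the ternary search tree mentioned here (due to Bentley and Sedgewick) stores strings with a three-way branch at every node: less, equal, greater. A minimal sketch (class and method names are mine; keys are assumed non-empty strings):

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "value")

    def __init__(self, ch):
        self.ch = ch
        self.lo = self.eq = self.hi = None   # the three-way branch
        self.value = None

class TernarySearchTree:
    """Map non-empty string keys to values via three-way branching."""

    def __init__(self):
        self.root = None

    def insert(self, key, value):
        self.root = self._insert(self.root, key, 0, value)

    def _insert(self, node, key, i, value):
        ch = key[i]
        if node is None:
            node = TSTNode(ch)
        if ch < node.ch:
            node.lo = self._insert(node.lo, key, i, value)
        elif ch > node.ch:
            node.hi = self._insert(node.hi, key, i, value)
        elif i + 1 < len(key):
            node.eq = self._insert(node.eq, key, i + 1, value)
        else:
            node.value = value
        return node

    def get(self, key):
        node, i = self.root, 0
        while node is not None:
            ch = key[i]
            if ch < node.ch:
                node = node.lo
            elif ch > node.ch:
                node = node.hi
            elif i + 1 < len(key):
                node, i = node.eq, i + 1
            else:
                return node.value
        return None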
The software running inside that "impossible" black box is ternary, and uses pattern-based analysis to decrypt. Instead of decrypting a small string by using hundreds of thousands of password strings, each varying by a single character until you stumble upon the correct one, you decrypt by studying the layers of _patterns_ generated by any encryption process. Once you open the first layer, the others become successively easier, because each layer gives clues to the next.
Anyone using a password like this isn't really protecting their data from an attacker with crypto knowledge (or access to common cracking tools).
True. You do have a point here. Let me be more precise, then: what we're discussing is the case where data is available, but in encrypted form without a key. In such cases, a key will soon be unnecessary....
The only way to prevent such pattern-based analysis from cracking open any binarily-encrypted clump of data, or even a packet stream, is to generate TRULY RANDOM keys, which can only be created ... you guessed it ... within a ternary pattern-based algorithm. You already know that binary algorithms can NEVER create true randomness, but can only emulate it with ever finer precision. Like approaching infinity; you are always approaching it, but never quite get there.
I must call bullshit on this. Ternary computation does not possess any magical properties. A number is a number, and the same algorithm written in base 2, 3, 8, or 10 can also be done in 16, 256, etc. The reason why computers normally can't do randomness is that they aren't designed for it. They take inputs and perform calculations in such a way that the same inputs always generate the same outputs. They are specifically designed to do things like error correction to prevent producing different outputs from the same inputs.
You are correct, but you are also thinking about ternary computation using a binary mindset, which implies that a statement is EITHER true OR false, with an excluded middle.
Use polyvalent logic, the third option, that a statement is "true under certain conditions" and "false under other conditions," and therefore its truth value is UNKNOWN, YET KNOWABLE. Now _this_ is the way to look at what is before you. If you continue to look at it with binary eyes, you will continue to see a pile of BS. I'm okay with that.
As you said, the fact that "a number is a number" is true. There are no magical properties in base-3 counting, nor 3nary logic. HOWEVER, base-3 logic does do something with that "excluded middle" which is really amazing, and binary folks don't normally see it:
Instead of leaping from one extreme to the other, it opens up smoothly into an infinite recursion, like fractals do. For example, something which is VIRTUALLY TRUE, being very close to completely true, is in this middle area. Likewise so is something VIRTUALLY FALSE, and so is the WHOLE RANGE BETWEEN. To binary eyes, it looks like a single empty area, neither true nor false and therefore useless. In truth it is actually a spectrum: a window into infinity which has much smoother transitions than binary.
It is _this_ incremental ability that makes ternary logic much closer to analog -- and strangely, quaternary logic is worse, not better.
Ternary, and ternary alone, is really well suited to interfacing with the analog world, and though you are correct: "the numbers do not change," the way we use the numbers does change.
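The "true under certain conditions" middle value discussed here is essentially what Kleene's strong three-valued logic formalizes. A sketch using None for the unknown value (the function names are mine):

```python
# Truth values: True, False, and None for "unknown, yet knowable".

def not3(a):
    return None if a is None else not a

def and3(a, b):
    if a is False or b is False:
        return False          # a single False decides the conjunction
    if a is None or b is None:
        return None           # otherwise an unknown keeps it unknown
    return True

def or3(a, b):
    if a is True or b is True:
        return True           # a single True decides the disjunction
    if a is None or b is None:
        return None
    return False
```

Note that a known False still settles a conjunction even when the other operand is unknown; the unknown only propagates when the known values don't decide the question.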
If you want to add randomness to computers, you need to design circuits to produce random data. Start with something like a Geiger counter. Every time it records a hit, take the time of the event (with a clock that can do millisecond or finer resolution), subtract the previous event time, and throw away all but the last few bits or trits of the delta. Take as additional events every valid SYN-ACK packet received by the TCP stack, keypresses that are at least 3 seconds after the previous keypress (you don't want to use the internal timings of typing, because a person might have a rhythm that introduces a bias into the data), and the pixels of webcams aimed at busy intersections. The idea is to use a lot of different sources for the events, so that an attacker with access to one source will be unable to know what other events intervened between those of which he is aware.
Yes, these examples are good. Note, however, all of these are reliant upon an interface with the analog, or "real" world. What I am talking about is generating such randomness entirely within the digital world, or at least so elegantly that there is no need for a Geiger counter, keyboard, or mouse movements.
If you are curious, here's an encrypted hint: "Babble code is for babies."
Here's my encrypted answer: "The Falcon flies east in the evening."
Excellent. Now we're getting playful, which is the whole point of this exercise.
I have no idea what your encrypted answer means; someday I will, but for now I am content with it as an amusing puzzle. :-)
It will be much easier to program the FPGA if we're laughing all the way to the bank...
-Jared
p.s. for people still reading:
I met an old programmer, this guy is in his 70s now. He described the first program he wrote, which was a way to store and retrieve recursive sets of data for materials used in high power KCPL transmission wires. He had an interesting time converting the program to C a few years later, because C does not have the ternary structure he originally used.
In C, he created nested IF statements to emulate the more elegant ternary structure he had used in the 1960s. I am certain that programmers in the early 1960s thought differently about their task than we do -- they thought in ternary, and we think in binary, because the ON/OFF transistor has shaped the way we think about programming logic in general. Before the transistor was invented, people thought more fluidly than we do.
A few languages: FORTH, Smalltalk, JavaScript, and others, have ways to extend the language dynamically, which lend them to creating ternary structures easily. C has no such provision, and probably won't for a long time yet. FORTH is closely related to DSSP, the language developed by the same Russian group that built the ternary computers back in the 60s, mentioned earlier. It is also still around, and I think it is a language used with FPGAs, IIRC.
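The nested-IF emulation described above, compared with a direct three-way dispatch, might look like this (sketched in Python rather than C purely for brevity; the example is mine):

```python
def sign(x):
    """Collapse a comparison into the three states -1, 0, +1."""
    return (x > 0) - (x < 0)

def classify_nested(delta):
    # Binary-style: two chained two-way branches.
    if delta < 0:
        return "less"
    else:
        if delta > 0:
            return "greater"
        else:
            return "equal"

def classify_ternary(delta):
    # Three-way dispatch keyed directly on the sign "trit".
    return {-1: "less", 0: "equal", 1: "greater"}[sign(delta)]
```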
On 8/12/07, Jared [email protected] wrote:
"industry." So, yes, there is an industry, and one of its
internal debates is whether to use the word "trinary" or "ternary," with the latter being more technically accurate, but the former being more popular.
I think 'trinary' works well to stress that it's an alternative to binary. But if you use ternary, you won't have trits, you'll have teits, or terits, or terts, or something.
True. You do have a point here. Let me be more precise, then: what we're discussing is the case where data is available, but in encrypted form without a key. In such cases, a key will soon be unnecessary....
I don't think so. There are keys that are sufficiently long that no brute-force attack is practical, because the size of the keyspace is on the order of the number of atoms in the planet.
The only way to prevent such pattern-based analysis from cracking
open any binarily-encrypted clump of data, or even a packet stream, is to generate TRULY RANDOM keys, which can only be created ... you guessed it ... within a ternary pattern-based algorithm. You already
Bullshit. There's nothing about a ternary algorithm that enhances generation of randomness. An algorithm either has random INPUTS or it doesn't. Given the same inputs, any algorithm always produces the same answer. If it doesn't, it's no good as an algorithm.
Use polyvalent logic, the third option, that a statement is
"true under certain conditions" and "false under other conditions," and therefore its truth value is UNKNOWN, YET KNOWABLE. Now _this_ is the way to look at what is before you. If you continue to look at it with binary eyes, you will continue to see a pile of BS. I'm okay with that.
Why not use a quaternary system of False/True/Unknown Yet Knowable/UnKnowable?
Instead of leaping from one extreme to the other, it opens up smoothly
into an infinite recursion, like fractals do. For example, something which is VIRTUALLY TRUE, being very close to completely true, is in this middle area. Likewise so is something VIRTUALLY FALSE, and so is the WHOLE RANGE BETWEEN. To binary eyes, it looks like a single
Ah. We need to sex it up then, and have False/Virtually False/Virtually True/True/UYK/UK.
empty area, neither true nor false and therefore useless. In truth
it is actually a spectrum: a window into infinity which has much more smooth transitions than binary.
Six not enough? How about a tuple: the first value ranges from 0 meaning known false to 100 for known true, and the second (again ranging from 0 to 100) describes the level of confidence, where 0 means totally unknown, and 100 is absolutely certain. When the second value hits 100, only 0 and 100 would be normal valid values for the first. The special value (50,100) might mean 'Unknowable', because the 100 says that you know (as much as you can know). Other (x,100) values would be available for things I haven't thought of yet. There's nothing magic about 100 though. It could be 0..15 for each value, neatly fitting the tuple into a byte. Nor do the two values have to use the same range of values.
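A sketch of this tuple scheme (the names are mine), including the (50,100) 'Unknowable' value and the byte-packed 0..15 variant:

```python
def make_belief(truth, confidence):
    """A (truth, confidence) pair, each 0..100 as in the proposal.

    truth:      0 = known false ... 100 = known true
    confidence: 0 = totally unknown ... 100 = absolutely certain
    """
    assert 0 <= truth <= 100 and 0 <= confidence <= 100
    if confidence == 100 and truth not in (0, 50, 100):
        raise ValueError("at full confidence only false, unknowable, or true")
    return (truth, confidence)

UNKNOWABLE = make_belief(50, 100)   # the special (50, 100) value

def negate(belief):
    truth, confidence = belief
    return (100 - truth, confidence)  # negation flips truth, keeps confidence

def pack_nibbles(truth4, conf4):
    """The 0..15 variant: both values fit in a single byte."""
    return (truth4 << 4) | conf4
```

One pleasant property: negation maps (50,100) to itself, so 'Unknowable' is its own negation.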
It is _this_ incremental ability that makes ternary logic much closer
to analog -- and strangely, quaternary logic is worse, not better.
Bullshit. 3 is one closer to infinity than 2. It is perfectly suited to the unique case of representing binary values plus an extra value for "I don't know", but imperfectly suited for representing 3 values plus IDK, where quaternary is better.
It doesn't even work all that well for your 'current in a wire' example. A wire can have current flowing in either direction at a certain voltage, no current, or alternating current with a certain frequency and voltage amplitude. The latter can have more complex waveforms suggesting multiple frequencies and amplitudes. And noise.
Hell, that right there would be a great generator for randomness. Subtract your ideal sine wave from the actual waveform of the AC coming into the machine and use the least-significant bits/trits/quits of it. I wonder what it takes to build a chip for that and tie it to the inside of the power supply where it's not easily tampered with. I'd need something else for this laptop though. A USB cable that runs over to the AC side of the brick, where the waveform monitoring chip would do its work, and a daemon to take the randomness produced by that chip and accumulate it in a file for later use when I'm on battery power.
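The waveform-residual idea above can be sketched directly (parameters and names are mine; a real design would also need careful debiasing and tamper protection):

```python
import math

def residual_bits(samples, amplitude, freq_hz, sample_rate, phase=0.0):
    """Subtract the ideal sine from measured samples; keep each residual's LSB."""
    bits = []
    for n, measured in enumerate(samples):
        t = n / sample_rate
        ideal = amplitude * math.sin(2 * math.pi * freq_hz * t + phase)
        residual = int(round(measured - ideal))
        bits.append(residual & 1)  # least-significant bit of the noise
    return bits
```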
Yes, these examples are good. Note, however, all of these are reliant upon an interface with the analog, or "real" world. What I am talking about is generating such randomness entirely within the digital world, or at least so elegantly that there is no need for a geiger counter, keyboard, or mouse movements.
The computer exists in reality. The way to get randomness out of electronics is to specifically design them to NOT do error correction and detection. There's no reason why a chip can't be made to accumulate randomness and dispense N bits of it at a time when requested. It's just a matter of whether there's enough demand for it. Not a damn thing to do with the base you count in.
Monty J. Harder wrote:
Hell, that right there would be a great generator for randomness. Subtract your ideal sine wave from the actual waveform of the AC coming into the machine and use the least-significant bits/trits/quits of it. I wonder what it takes to build a chip for that and tie it to the inside of the power supply where it's not easily tampered with. I'd need something else for this laptop though. A USB cable that runs over to the AC side of the brick, where the waveform monitoring chip would do its work, and a daemon to take the randomness produced by that chip and accumulate it in a file for later use when I'm on battery power.
The problem is, the 'noise' can be polluted by external forces (i.e., the men with black hats who want to read all your files).
The commercial products that generate randomness (yes, they do exist) typically use something like thermal noise (or some other random physical phenomena) and are (hopefully) carefully designed to avoid introducing bias or otherwise polluting the random data:
http://en.wikipedia.org/wiki/Hardware_random_number_generator
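The classic trick for the bias problem mentioned here is the von Neumann extractor; a sketch:

```python
def von_neumann_extract(bits):
    """Von Neumann debiasing over non-overlapping bit pairs.

    01 -> 0, 10 -> 1; the pairs 00 and 11 are discarded. The output is
    unbiased provided the input bits are independent with a constant bias.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out
```

It discards a lot of input (every 00 and 11 pair), but the surviving bits carry no trace of the original bias.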
-- Charles Steinkuehler [email protected]
Trinux has already been used for a floppy-based security/rescue distro.
The fork of Linux which would run on this system? We could call it Trinux? It would actually not be a fork, but I hadda think of some way to bring this around to on-topic...
Okay, how about "Tripolar Linux." I already have the http://tripolar.sourceforge.net domain, and Tripolar Ajax is taking a long time to finalize, so it would be trivial to get permission from the project owner of that domain.
-Jared