

Let's crack Sloot algorithm - infinite "compression"

Started by nix85, July 16, 2020, 12:57:03 PM


nix85

i know this is not overunity, but it kinda is..

it has already been discussed here https://overunity.com/7456/another-strange-story/

in case you didn't hear about it, Jan Sloot was a TV technician from the Netherlands who, after 20 years of research, in 1995 created a data sharing technique which could allegedly store a complete movie in 8 kilobytes. one day before he was to hand over the code and become a multimillionaire, he died of a "heart attack"

he allegedly played 16 movies simultaneously from a 64kb phone card

he said it was "so simple that a simple technician could figure it out in a second if shown the source code"

he was EXTREMELY paranoid and secretive about it

>> https://www.youtube.com/watch?v=jLp5gS9h9UI

data to be sent was 2 million times smaller than the original.

if taken literally, this would be like squeezing, let's say, 1,999,999 different numbers into 1 digit. impossible like that of course, information entropy forbids it..
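the counting argument can be made concrete with a few lines of arithmetic (the 700 MB movie size is my assumption, just for scale):

```python
# Pigeonhole sketch: an 8 kB key has 2**65536 possible values, while a
# 700 MB file (size assumed purely for scale) has 2**5872025600.
# A fixed decoder can therefore reach only a vanishing fraction of files.
key_bits = 8 * 1024 * 8              # bits in an 8 kB key
movie_bits = 700 * 1024 * 1024 * 8   # bits in a 700 MB file

print(movie_bits // key_bits)  # 89600: the file's exponent is 89600x larger
```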

...but is it?

i'll first quote a guy who left a comment here http://jansloot.telcomsoft.nl/Sources-1/More/CaptainCosmos/Not_Compression.htm#.Xdyn0PlKiCq

Quote: You're almost there. Sloot just used an analog converter. The analog signal was created by a device that was parametrized by the data on the chip card. Precise digital measurement of the analog signal's output then regenerates the digital sequence. Digital compressors are extremely limited because computers work with integer or real data. Analog systems can contain millions of times (some say an unlimited multiple of) the quantity of information that digital systems contain.

Sloot was a TV technician in the transition period from high-quality analog technology to digitalization. He made the link between the two. His technique would, or will, ruin a whole industry, making "our" high-tech IT based on digital processing completely obsolete.

For those who do not understand the part above: consider a circle. The relation between diameter and circumference is the number PI, that is, 3.1415... with an infinite number of digits, never recurring. That's quite an amount of information in a simple analog thing.

There are other simple analog things that contain lots of information, even if you stay in the "rational number" range. Every movie ever made on DVD or going to be made on DVD, and every file on any computer in the world, can be drawn as a single line on one sheet of paper: the slope (Y/X), as a finite row of digits, can be the result of a digital measurement. Replace the sheet with a (virtual) TV screen, and you have the SLOOT compression, or something like it. It's a digital-to-analog and reverse thing. No more, no less.

Sloot himself spoke about the "end of the digital era".

but does it really have to be analog?

Sloot was generating these tiny keys by comparing the source file to a database, 70 MB for each type of data (video, sound, text, pictures), and the reverse was done on the other end.

Roel Pieper, former CTO and board member of Philips, is quoted as saying (translated from Dutch):

Quote: It is not about compression. Everyone is mistaken about that. The principle can be compared with a concept like Adobe PostScript, where sender and receiver know what kinds of data recipes can be transferred, without the data itself actually being sent.

indeed, there is no compression here.

remember that since the data is so extremely "reduced", the same 8 kB key must be able to represent many different sets of data, different movies. but how?

WITH DYNAMIC COMPONENTS ON BOTH THE ENCODING AND DECODING SIDES, PARAMETRIZED BY A SIMPLE SET OF VARIABLES AT THE BEGINNING OF THE KEY.

this is the only way this could have worked, even theoretically

to put it most plainly, we can represent any file of any kind as simply a sequence of numbers, so the goal is to reduce 1 page full of random numbers to, let's say, 3/4 or 2/3 or even 1/2 of the page without data loss; with multiple iterations, near-infinite "compression" results. but how?
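a quick sanity check on the "reduce random numbers, then iterate" idea: a general-purpose compressor gains nothing on random bytes (a minimal, reproducible sketch):

```python
import random
import zlib

# Random bytes carry maximum entropy, so no lossless scheme can shrink
# them; "compress, then iterate" would otherwise reduce anything to nothing.
random.seed(0)  # fixed seed so the demo is reproducible
data = bytes(random.getrandbits(8) for _ in range(4096))

comp = zlib.compress(data, 9)
print(len(data), len(comp))            # the "compressed" form is not smaller
assert zlib.decompress(comp) == data   # lossless round trip
```

any scheme that shrank arbitrary input by even one digit could be iterated down to nothing, which is exactly what the counting argument rules out.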

like i said, suppose we have a certain mechanism generating random data, let's say 10 random number generators on the encoder and decoder sides.

let's say when a movie is about to be turned into an 8 kB key, those random generators are parametrized by the exact time and date, so it's always unique.

a totally unique and random datastream will be produced by the generators, THE SAME datastream on both sides.

we also have the same algorithm and the same 70 MB (can be more) library on both sides.

the 70 MB library was probably an index of random numbers, and the algorithm first had to recognize which chunks of generated random numbers correspond to the source file..

maybe then it just sent the number of the table in which the big number was to be found, and in combination with the timestamp, the software on the other side could look into that table, compare it with its own datastream at that timestamp, and extract the usable data.
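a minimal sketch of that idea, with hypothetical names (`stream`, `encode`, `decode`): both sides regenerate the same pseudo-random stream from a shared seed (the timestamp above), and only a seed plus an offset crosses the wire:

```python
import random

def stream(seed, n):
    """Both sides regenerate the identical pseudo-random byte stream."""
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(n))

def encode(target, seed, search_len):
    """Find the target chunk in the shared stream; send only (seed, offset, length)."""
    off = stream(seed, search_len).find(target)
    return (seed, off, len(target)) if off != -1 else None

def decode(key):
    """Regenerate the same stream from the seed and cut out the chunk."""
    seed, off, n = key
    return stream(seed, off + n)[off:off + n]
```

the catch shows up immediately: to find a k-byte chunk you expect to scan on the order of 256**k bytes of stream, so the offset needs about as many bits as the chunk itself, and the counting argument wins again.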

this is just a crude idea, probably not the best one, but i believe it was somewhere along these lines.

i also stumbled upon this old video (whether it's fake, i don't know), claiming to reduce any file to 256 bytes

https://www.youtube.com/watch?v=kQsWP6n03EU

as Sloot himself said, it was extremely simple


WhatIsIt

Probably this will not be right, but I can try.

Say letters are stored as 32-bit numbers, 1 integer each.


We have the sequence ABCD, which is 4 * 32-bit numbers (integers),

but with only the letters A-D it has just 256 combinations (4^4), which we can write in 1 integer or less

For the combination AAAA we write the number 0, and the max is 255 (DDDD).


Can we apply this further?



Because there are 26 letters, the number of combinations for ABCD will be 456976 for all possible letters in that 4-symbol block,

26 * 26 * 26 * 26 = 456976

and 456976 we can also write inside 1 integer instead of 4 integers.

AAAA = 0

ZZZZ = 456975


So, we write the number of the combination for blocks of 4 or more symbols, rather than the integer symbols.

And if we have all combinations in a small database, we can simply reverse the number back to the given block of 4 letters.
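That numbering can be sketched directly (hypothetical `pack`/`unpack` helpers); note that log2(456976) ≈ 18.8 bits, so the saving comes entirely from the wasteful 32-bits-per-letter starting point, not from beating entropy:

```python
def pack(block):
    """Map a block of 4 letters A-Z to one number in [0, 456975]."""
    n = 0
    for ch in block:
        n = n * 26 + (ord(ch) - ord('A'))
    return n

def unpack(n):
    """Inverse: recover the 4-letter block from its number."""
    chars = []
    for _ in range(4):
        n, r = divmod(n, 26)
        chars.append(chr(r + ord('A')))
    return ''.join(reversed(chars))

print(pack('AAAA'), pack('ZZZZ'))  # 0 456975
print(unpack(pack('ABCD')))        # ABCD
```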



I said I will try only, not claiming it will work.

It will work only for limited number of symbols.


Same principle goes for number of colors of 1 pixel.

Digital to analog, or maybe vice versa?

nix85

that was one of the first things i thought of; how many bits per letter depends on the encoding system

Quote: An ASCII character in 8-bit ASCII encoding is 8 bits (1 byte), though it can fit in 7 bits.

An ISO-8859-1 character in ISO-8859-1 encoding is 8 bits (1 byte).

A Unicode character in UTF-8 encoding is between 8 bits (1 byte) and 32 bits (4 bytes).

A Unicode character in UTF-16 encoding is between 16 (2 bytes) and 32 bits (4 bytes), though most of the common characters take 16 bits. This is the encoding used by Windows internally.

A Unicode character in UTF-32 encoding is always 32 bits (4 bytes).

An ASCII character in UTF-8 is 8 bits (1 byte), and in UTF-16 - 16 bits.

The additional (non-ASCII) characters in ISO-8859-1 (0xA0-0xFF) would take 16 bits in UTF-8 and UTF-16.
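those sizes are easy to verify; a quick check (the sample characters are my picks, one per Unicode range):

```python
# A (ASCII), é (Latin-1), € (BMP), grinning face (supplementary plane)
for ch in ('A', '\u00e9', '\u20ac', '\U0001F600'):
    print(hex(ord(ch)),
          len(ch.encode('utf-8')),      # 1, 2, 3, 4 bytes
          len(ch.encode('utf-16-le')),  # 2, 2, 2, 4 bytes
          len(ch.encode('utf-32-le')))  # always 4 bytes
```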

i gave a lot of thought to the extended ASCII table; each of the 8-bit characters may represent a number from 0-255, or 256-511, or 512-767, or 768-1023, or a certain function, or a coordinate of a cube encoding a large number of variables.

https://theasciicode.com.ar/

for example, a 255x255x255 cube contains 16,581,375 cells, and any of those cells can be addressed with just 24 bits.

of course there is no reason to limit this to a 3D cube; we could give it 30 dimensions instead, so a string of 30 ASCII characters would be enough to address any of this many cells:

1,571,105,731,713,312,715,511,913,444,948,824,285,516,982,702,388,429,082,930,088,043,212,890,625
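the cube addressing is just mixed-radix positional notation; a minimal sketch (function names are mine):

```python
def cell_index(coords, base=255):
    """Map per-dimension coordinates (0..base-1) to one cell number."""
    n = 0
    for c in coords:
        n = n * base + c
    return n

def cell_coords(n, dims, base=255):
    """Inverse: recover the coordinates from the cell number."""
    out = []
    for _ in range(dims):
        n, r = divmod(n, base)
        out.append(r)
    return list(reversed(out))

print(255 ** 3)                               # 16581375 cells in the 3D cube
print(cell_coords(cell_index([1, 2, 3]), 3))  # [1, 2, 3]
```

since 2**24 = 16,777,216 ≥ 16,581,375, 24 bits address the 3D cube; the 30-dimensional count is 255**30, the 73-digit number above.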

but that approach has a great limitation: it does not allow for further compression, unlike a string of numbers that could be further reduced.

this is one of the brute force approaches, not a solution by itself. the solution might be a combination of this with some complex logic and a random data stream. the receiver knows "what can and cannot be sent".


WhatIsIt

Yes,

You saw it well; it cannot be compressed further.