I've been thinking about file compression and came up with my own method. Instead of storing the file's data, it gives the computer all the information it needs to reconstruct the rest on its own, so any file can be converted to a descriptor of about 200 bytes. To get the original file back, the computer has to work through a range of combinations that is measurable per file, and by measuring CPU instructions a timer could estimate the maximum time to wait while it tries every combination in that range.

I don't know how long decompression will take, and the only way to really know is to build a proof of concept, even without the timer coded in (although the timer would really help when experimenting with big files). I need help writing the program so I can experiment with different file sizes, and also test splitting files into chunks at the compression stage (about 150 extra bytes per chunk) in case smaller pieces take less time in general.

Can someone help me code this? It basically requires some knowledge of how a file is just a sequence of bytes, how hashes work, and how to measure the maximum time needed to try every combination in the range, in case your file happens to be the last combination tried. The method guarantees a very small file, but I don't know whether it's worth it in terms of time, and the only way to get a definitive answer is to build the program. If you can help me out, I would really appreciate it. I can describe how the entire thing works in full detail if anyone is interested in helping, but here are rough sketches of what I'm picturing.
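For the core step, here's roughly what I have in mind in Python. I'm assuming the descriptor is a SHA-256 digest (32 bytes) plus the original length (8 bytes), zero-padded to the 200-byte figure; the specific hash and layout are placeholders I picked for the sketch, not fixed parts of the method:

```python
import hashlib
import itertools

def compress(path: str) -> bytes:
    """Build a fixed-size descriptor: SHA-256 digest (32 bytes) plus the
    original file length (8 bytes), zero-padded to 200 bytes."""
    data = open(path, "rb").read()
    digest = hashlib.sha256(data).digest()
    length = len(data).to_bytes(8, "big")
    return (digest + length).ljust(200, b"\x00")

def decompress(descriptor: bytes) -> bytes:
    """Try every byte combination of the recorded length until one hashes
    to the stored digest. Only finishes in reasonable time for tiny files."""
    digest = descriptor[:32]
    length = int.from_bytes(descriptor[32:40], "big")
    for candidate in itertools.product(range(256), repeat=length):
        data = bytes(candidate)
        if hashlib.sha256(data).digest() == digest:
            return data
    raise ValueError("no combination in the range matched the digest")
```

With a 2-byte file this returns almost instantly (at most 65,536 candidates to try); each extra byte multiplies the search space by 256, which is exactly the growth I want to measure.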
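For the chunked variant, each chunk would get its own descriptor so it can be brute-forced independently. The 40 bytes of real data per chunk here follow the same placeholder layout as above, well inside the roughly 150 bytes of extra overhead per chunk I mentioned:

```python
import hashlib

def chunk_descriptor(chunk: bytes) -> bytes:
    """Same placeholder layout as the whole-file descriptor:
    digest (32 bytes) + chunk length (8 bytes)."""
    return hashlib.sha256(chunk).digest() + len(chunk).to_bytes(8, "big")

def compress_chunked(path: str, chunk_size: int) -> list[bytes]:
    """One descriptor per chunk, so each chunk is searched on its own:
    256**chunk_size combinations per chunk instead of 256**len(file)."""
    data = open(path, "rb").read()
    return [chunk_descriptor(data[i:i + chunk_size])
            for i in range(0, len(data), chunk_size)]
```

Decompression would then brute-force each chunk separately and concatenate the results in order.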
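And for the timer, I think a simple estimate of the maximum wait is enough to start with: time one hash, then multiply by the number of combinations in the per-file (or per-chunk) range. This measures wall-clock time rather than CPU instructions, which I'm assuming is close enough for experimenting:

```python
import hashlib
import time

def estimate_max_wait(length: int, samples: int = 100_000) -> float:
    """Worst-case decompression time in seconds for `length` bytes:
    time one hash and multiply by all 256**length combinations."""
    payload = b"\x00" * length
    start = time.perf_counter()
    for _ in range(samples):
        hashlib.sha256(payload).digest()
    per_hash = (time.perf_counter() - start) / samples
    return per_hash * 256 ** length

# How fast the worst case grows with size:
for n in range(1, 7):
    print(f"{n} bytes -> {estimate_max_wait(n):.3g} s worst case")
```

Even the first few lengths show the multiply-by-256-per-byte growth clearly, which should help in picking a chunk size cap for the experiments.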