
I know that any Rubik's Cube position can be solved in 20 moves or less, but which are the allowable moves? I'm assuming it includes FBUDRL and all of their inverses (i.e., F'B'U' etc.). What about F2B2U2...? What about fbudrl and their inverses and doubles? And what about xyz? Thanks.

EDIT: I'm writing a computer program which will take a position and recursively try every move until it finds a solution which is 20 moves or less. So if it can be done without, e.g., fbudrl, then that would reduce the number of steps by ~20^6, which would of course be hugely helpful. Hopefully, by using a minimal set of moves, the algorithm will be able to find a solution within a reasonable amount of time (i.e., within a few days).

Rotations aren't needed for computer solutions. All moves are permitted in "optimal" searches. One thing you didn't mention is slice moves; I assume they count as 2 moves, though I'm not entirely sure.

Like the others said, allowed moves are U, U2, U', R, R2, R', F, F2, F', L, L2, L', B, B2, B', D, D2, D'

By the way: with 6 moves you don't have 20^6 possibilities, you have 6^20, which is a bit more.

But because you have 18 possible moves, you have ~18^20 possible solutions. You can reduce this number by ignoring solutions like FUU2L (this is FU'L), or LRL' (this is just R), ...
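To make that pruning concrete, here's a small sketch of my own (not taken from any solver): skip sequences that repeat a face, and keep only one order of each commuting opposite-face pair (L R is the same as R L, so count it once):

```python
from itertools import product

FACES = "URFDLB"
OPPOSITE = {"U": "D", "D": "U", "R": "L", "L": "R", "F": "B", "B": "F"}

def is_reduced(seq):
    """Reject sequences with two consecutive turns of the same face
    (e.g. F U U2 L collapses to F U' L), and keep only one canonical
    order for commuting opposite-face pairs (U D but not D U, say)."""
    for a, b in zip(seq, seq[1:]):
        if a == b:
            return False
        if OPPOSITE[a] == b and a > b:  # arbitrary canonical order
            return False
    return True

# count reduced face patterns of length 3; each face still has
# 3 turn choices (quarter, half, inverse), hence the 3**3 factor
faces_only = sum(is_reduced(s) for s in product(FACES, repeat=3))
print(faces_only * 3**3)  # reduced 3-move sequences vs. 18**3 = 5832
```

Even this simple filter cuts the length-3 count from 5832 to 3240, and the savings compound with depth.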


Take a look at this page: compcube
Jaap explains important concepts such as transition tables and pruning tables. These can speed up your program a lot.
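As a toy illustration of what a pruning table buys you (this is my own miniature example, not Jaap's): run one breadth-first search from the solved state ahead of time and store every state's distance; during the real search, any branch whose table value exceeds the remaining depth can be cut immediately:

```python
from collections import deque

# toy puzzle: permutations of 4 tokens, a move swaps two adjacent ones
SOLVED = (0, 1, 2, 3)
MOVES = [(0, 1), (1, 2), (2, 3)]

def swap(state, move):
    i, j = move
    s = list(state)
    s[i], s[j] = s[j], s[i]
    return tuple(s)

def build_pruning_table():
    """BFS from the solved state records every state's exact distance;
    a real cube solver does this for a small feature of the cube
    (e.g. corners only), not for the whole state space."""
    dist = {SOLVED: 0}
    queue = deque([SOLVED])
    while queue:
        state = queue.popleft()
        for m in MOVES:
            nxt = swap(state, m)
            if nxt not in dist:
                dist[nxt] = dist[state] + 1
                queue.append(nxt)
    return dist

table = build_pruning_table()
print(len(table), table[(3, 2, 1, 0)])  # 24 states, farthest is 6 away
```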

Thanks. And ya, I meant 6^20. I thought maybe if only FBUDRL were considered valid moves, then I might be able to do it. But if the 20 total moves includes 18 different transformations then there's no way. I'll see what I can do with those tables though.


The minimum number of moves required to solve the cube is zero. Theorized (and proven) lower and upper bounds for the empty scramble are 0 face turns, 0 rotations, and 0 slice moves. This means the cube can be solved without F, U, B, L, R, or D turns as well!


Doing a simple recursive search (breadth-first search, for example) on the cube is really, really, REALLY inefficient and time- and memory-consuming, even for supercomputers! You would need not giga-, not tera-, not even peta-, but at least exabytes (millions of terabytes) of memory. Breadth-first search uses memory proportional to the number of vertices (in this case, the number of states of the cube). And we haven't even talked about the running time, which is again proportional to the number of states.

One of the most efficient ways to search for optimal solutions is based on iterative-deepening depth-first search with a heuristic (IDA*), plus pruning tables and move tables. Look at http://www.jaapsch.net/puzzles/compcube.htm, and examine Kociemba's Two-Phase algorithm, the Standard Optimal Solver (SOS), and Kociemba's Huge Optimal Solver (HOS): http://kociemba.org/cube.htm
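The IDA* skeleton itself is short. Here is a minimal sketch on a toy puzzle (sorting a tuple with adjacent swaps); the move set and heuristic are illustrative stand-ins, not the cube's:

```python
# toy puzzle: sort a 4-tuple using adjacent swaps
SOLVED = (0, 1, 2, 3)
MOVES = [(0, 1), (1, 2), (2, 3)]

def apply_move(state, move):
    i, j = move
    s = list(state)
    s[i], s[j] = s[j], s[i]
    return tuple(s)

def heuristic(state):
    # inversion count: one adjacent swap fixes at most one inversion,
    # so this never overestimates the remaining distance (admissible)
    return sum(a > b for i, a in enumerate(state) for b in state[i + 1:])

def search(state, g, bound, path):
    f = g + heuristic(state)
    if f > bound:
        return f                      # report the overshoot f-value
    if state == SOLVED:
        return path
    minimum = float('inf')
    for m in MOVES:
        t = search(apply_move(state, m), g + 1, bound, path + [m])
        if isinstance(t, list):
            return t
        minimum = min(minimum, t)
    return minimum

def ida_star(start):
    """Depth-first search with an f-bound, restarted with the smallest
    f-value that exceeded the previous bound: linear memory, like DFS,
    but optimal, like BFS with an admissible heuristic."""
    bound = heuristic(start)
    while True:
        t = search(start, 0, bound, [])
        if isinstance(t, list):
            return t
        bound = t

print(len(ida_star((3, 2, 1, 0))))  # optimal: 6 swaps
```

The key property is the memory footprint: unlike BFS, IDA* only ever stores the current path, which is why it scales to the cube at all.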

Also, there are implementations of Kociemba's Two-Phase algorithm in Java and Mathematica (see links above), and I once wrote one in Matlab, based on the Mathematica code.

EDIT:
I suggest you start with an implementation of Thistlethwaite's algorithm. It's quite easy, using only pre-generated lookup tables, and it solves the cube in fewer than 45 moves guaranteed (about 33 on average, if I remember correctly).
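For reference, Thistlethwaite's idea is a chain of nested move groups, where each phase is allowed a smaller move set than the last. The shrinking sets look like this (the list layout is just my sketch; the groups themselves are standard):

```python
# Thistlethwaite's nested groups G0 > G1 > G2 > G3: each phase
# restricts the allowed moves further until only half turns remain
PHASE_MOVES = [
    # G0 -> G1 (orient edges): all 18 face moves allowed
    ["U", "U2", "U'", "D", "D2", "D'", "R", "R2", "R'",
     "L", "L2", "L'", "F", "F2", "F'", "B", "B2", "B'"],
    # G1 -> G2: F and B restricted to half turns
    ["U", "U2", "U'", "D", "D2", "D'", "R", "R2", "R'",
     "L", "L2", "L'", "F2", "B2"],
    # G2 -> G3: R and L also restricted to half turns
    ["U", "U2", "U'", "D", "D2", "D'", "R2", "L2", "F2", "B2"],
    # G3 -> solved: only half turns remain
    ["U2", "D2", "R2", "L2", "F2", "B2"],
]

for i, moves in enumerate(PHASE_MOVES):
    print(f"phase {i}: {len(moves)} allowed moves")
```

Each phase's subproblem is small enough that its lookup table fits comfortably in memory, which is exactly why this is a good first solver to write.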


If you can store each cube state in one byte, it would take more than 43 billion gigabytes.
I am part of a computer/math geek community, and we once tried, for fun, to solve any cube optimally on a TI-83+ calculator in a reasonable amount of time; the longest part of the program was decompression of our extremely compressed cube-state tables and data. It takes an absolute minimum of 3 bytes to store the solved state with the variable data storage system suggested by the proposer of the challenge, and the most "complicated" states took 6 bytes. We went with a system that stored them all in 6 bytes for speed.
6*43 = 258. 258 billion billion bytes is not even remotely plausible without a massive network of several dozen of the world's fastest and biggest (in memory capacity) supercomputers AND high-throughput networks, all devoting 100% of their power towards just storing the stuff in memory.
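The arithmetic is easy to double-check from the exact state count (the 258 above comes from rounding to 43 first; the exact product is closer to 260):

```python
STATES = 43_252_003_274_489_856_000   # reachable 3x3x3 positions

one_byte_total = STATES               # bytes at 1 byte per state
print(one_byte_total // 10**18)       # billions of gigabytes: 43

six_byte_total = 6 * STATES           # bytes at 6 bytes per state
print(six_byte_total // 10**18)       # billion billion bytes: 259
```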

I'm not even going to attempt an educated estimate of how long it would take a hypothetical superduperovar9000computer to optimally search through all paths to every state. The number of cube paths that lead to states even 15 turns from solved (not to mention 20! that's 5 moves extra!) hasn't even been calculated, I'm guessing because that number would be gargantuan.

Assume a look-up table where you can find every cube state's optimal solution. So the table has about 4.3*10^19 rows; and assume that for a given cube I can immediately tell you its number in the table. Also assume that one look-up requires no time.

So, here is a scramble, R U F' B2 (...) R; let's say its number is 12218. Then the 12218th element of the table leads me to the optimal solution of this particular scramble.

And here is the fun question: each row can be stored in only two bits! (Yes, 2 bits, or 1/4 byte.) So the full table requires "only" 2*4.3*10^19 bits, which is nearly 9.4 exabytes, or 9.8 million terabytes. How?

Note: In theory, such a table is easy to construct, as well as the "cube state to number" function. Only the computational time would be astronomical to construct the table, not to mention the memory storage. But once we would have such a table, it would be easy to search up any scramble's optimal solution immediately.
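One standard answer (the trick behind Korf-style pruning tables; sketched here on a toy puzzle of my own, not the cube): store each state's optimal depth modulo 3. A move changes the true depth by -1, 0, or +1, and those three values are distinct mod 3, so two bits per state are enough to recognize which neighbor is one move closer, and you can walk straight down to solved:

```python
from collections import deque

# toy puzzle: permutations of 4 tokens, a move swaps two adjacent ones
SOLVED = (0, 1, 2, 3)
MOVES = [(0, 1), (1, 2), (2, 3)]

def adj_swap(state, move):
    i, j = move
    s = list(state)
    s[i], s[j] = s[j], s[i]
    return tuple(s)

# full BFS finds every true depth; keep only depth mod 3 (2 bits/state)
depth = {SOLVED: 0}
queue = deque([SOLVED])
while queue:
    s = queue.popleft()
    for m in MOVES:
        t = adj_swap(s, m)
        if t not in depth:
            depth[t] = depth[s] + 1
            queue.append(t)
mod3 = {s: d % 3 for s, d in depth.items()}

def solve(state):
    """A neighbor whose stored value is (current - 1) mod 3 really is
    one move closer, since neighbor depths d-1, d, d+1 all differ
    mod 3; repeating this walks an optimal path to solved."""
    path = []
    while state != SOLVED:
        for m in MOVES:
            t = adj_swap(state, m)
            if mod3[t] == (mod3[state] - 1) % 3:
                state = t
                path.append(m)
                break
    return path

print(len(solve((3, 2, 1, 0))))  # optimal: 6 moves
```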

True. log[SUB]2[/SUB](3) to be precise (so the table requires about 7.4 exabytes). But I don't know how can you manage it computationally / algorithmically.


There's arithmetic coding, which I think would give you the optimum, but I don't know how complicated random access is. More realistically, you could cover five states in one byte, which would be 1.6 bits per state. Or 111 states in 22 bytes, etc.:

Code:

states bytes bits/state overhead over log_2(3)
5 1 1.6000000000 0.0150374993
111 22 1.5855855856 0.0006230849
217 43 1.5852534562 0.0002909555
323 64 1.5851393189 0.0001768182
429 85 1.5850815851 0.0001190844
535 106 1.5850467290 0.0000842283
641 127 1.5850234009 0.0000609002
747 148 1.5850066934 0.0000441927
853 169 1.5849941383 0.0000316376
959 190 1.5849843587 0.0000218580
1065 211 1.5849765258 0.0000140251
1171 232 1.5849701110 0.0000076103
1277 253 1.5849647612 0.0000022604
3937 780 1.5849631699 0.0000006692
Python 3:
from math import *

print('states bytes bits/state overhead over log_2(3)')
minBitsPerState = 2
for b in range(1, 1000):
    states = floor(log(2**(8*b)) / log(3))
    bps = (8*b) / states
    if bps < minBitsPerState:
        print('{:6} {:7} {:.10f} {:.10f}'.format(states, b, bps, bps - log(3)/log(2)))
        minBitsPerState = bps

But five states in one byte looks the most reasonable as it's simple and gets pretty close to optimal already (less than 1% overhead).
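For completeness, here's one way the five-per-byte packing could look (function names are mine): treat each byte as a 5-digit base-3 number, so random access is one byte read plus a little arithmetic:

```python
def pack(trits):
    """Pack values in {0, 1, 2}, five per byte, as base-3 digits
    (trit i of a chunk lands at weight 3**i)."""
    out = bytearray()
    for i in range(0, len(trits), 5):
        b = 0
        for t in reversed(trits[i:i + 5]):
            b = b * 3 + t
        out.append(b)
    return bytes(out)

def get(packed, index):
    """Random access: fetch the byte, then extract one base-3 digit."""
    byte = packed[index // 5]
    return (byte // 3 ** (index % 5)) % 3

data = [0, 2, 1, 1, 0, 2, 2, 0]
packed = pack(data)
print(len(packed), [get(packed, i) for i in range(len(data))])
```

Since 3^5 = 243 fits in one byte (256 values), nothing overflows, and the 13 unused byte values are the 0.94% overhead from the table above.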


I found a paper entitled "Random access decompression using binary arithmetic coding". Unfortunately it's not free to read, but based on its title and abstract, it could be the solution to the log[SUB]2[/SUB]3 storage.