# Square-1 shape probability distribution

Discussion in 'Puzzle Theory' started by Lucas Garron, Sep 3, 2008.


1. ### Lucas GarronSuper-Duper ModeratorStaff Member

In the WCA forum, here and at http://www.worldcubeassociation.org/forum/viewtopic.php?f=4&t=449, Prusak's lucky scramble brought up the issue of fair scrambling.

In the same way that it makes most sense to solve by first getting into cube shape and then solving within it, it's most logical to consider scrambling as reaching a random shape and then a random permutation of the pieces within that shape.

Permutation of the pieces, WLOG in cube shape, is a relatively simple problem.
However, the shapes are a funnier issue. There are two ways to define "random shape," if you consider only the piece arrangements on top and bottom. AUF, ADF, and middles don't really matter, and can at worst just be dealt with at the end of a scramble.

So, you can consider a "random shape" to be a random choice of one of the 170 configurations. Problem is, why not choose among the 90 shapes that you get if you consider shapes that are z2-flips of each other as the same? The distribution is now different.

I think the "best" way to do it is to improve our current idea of scrambling: Do random moves for a really long while (rather, do something that produces equivalent scrambles). The current scrambler doesn't nearly scramble to an even distribution, though.

So, I took the following definition: In any state, the top layer can be turned a certain number of ways to realign a twistable crack, and you rotate to a random one of these. Same for the bottom. Then you do a /, and continue. Over time, the probabilities even out. Permutation of pieces within the shapes becomes random (I'm pretty sure, but I don't want to think of an argument, and I don't even want to think about what happens if it isn't).
The shapes eventually reach a distribution that doesn't change.

Some of you will probably recognize this as a random walk of the Square-1 shape graph. Each shape has one edge connecting to another for each way to get from it to the other (you can consider it undirected).
So, I have a 170x170 matrix, built from the (symmetric) counts of ways between shapes. Each row corresponds to a shape and sums to 1; each entry is weighted by the number of ways it's possible to get to the column's shape. So, pretend you know how Markov stuff works, raise the matrix to a sufficiently high power, and you get a stable distribution.
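The recipe just described can be sketched on a toy example. The 3x3 matrix below is an illustrative stand-in, not the real 170x170 shape matrix:

```python
import numpy as np

# Toy stand-in for the shape graph: adj[i][j] counts the ways to get
# from shape i to shape j (symmetric, self-loops allowed).
adj = np.array([[0, 2, 1],
                [2, 0, 1],
                [1, 1, 2]], dtype=float)

# Row-normalize so each row sums to 1, giving a Markov transition matrix.
P = adj / adj.sum(axis=1, keepdims=True)

# Raise to a sufficiently high power; every row converges to the
# stationary distribution of the random walk.
Pinf = np.linalg.matrix_power(P, 1000)
print(Pinf[0])
```

For a symmetric count matrix like this one, the stationary probability of a node is its row sum divided by the total weight, which gives a quick sanity check on the output.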

More important stuff
So, after that long introduction: As far as I know, no one has ever actually calculated the probability distributions of the nodes in a random walk of the Square-1 shape graph. So I thought I'd do it.

I tried to take the most justifiably natural way to calculate the distributions, and computed everything in Mathematica. The results are at http://cube.garron.us/sq1/sq1graph.csv. If you want to look at the shapes, I've rendered them and put them in a zip.

So, if we're ever going to do random-state scrambling for Square-1, shapes should probably be chosen by weighting according to this distribution. (Or we could do something like generating 1000 random moves, and sending it to Jaap's solver to be optimized.)
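Once the distribution is known, the shape-picking step of such a random-state scrambler is just weighted sampling. A sketch with placeholder shape names and weights (not the real csv values):

```python
import random

# Pick a shape by weighted sampling; a random permutation of the pieces
# within that shape would then follow. Names and weights are placeholders.
shapes = ["shape_A", "shape_B", "shape_C"]
weights = [8/613, 4/613, 2/1839]   # relative weights; need not sum to 1
rng = random.Random(2008)
shape = rng.choices(shapes, weights=weights, k=1)[0]
print(shape)
```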

So, I've posted here to see what other mathematically inclined people think about it. Are there good arguments for another scheme? Does anyone actually know about how random the pieces should eventually be within the shapes? Does anyone really care about the eventual shape distribution?

I can also give you the Mathematica source code and a more explicit method definition if you want.

2. ### jazzthief81Member
This is very interesting.

I wrote something a while ago that builds up the Square-1 shape graph with all the probabilities of each transition and renders them graphically. I was planning on doing something similar to what you did, by empirically working out the probability distribution from many random paths in the graph. I never got round to doing it.

So yes, I was certainly interested to find this out, and I'm glad you did.

4. ### qqwrefMember

Hi, as for fractions, here are the decimals found in the csv file and their fractional equivalents:

0.00108754758020641 = 4/3678 = (2/3)/613
0.00135943447525802 = 5/3678 = (5/6)/613
0.00217509516041282 = 8/3678 = (4/3)/613
0.00326264274061924 = 12/3678 = 2/613
0.00435019032082564 = 16/3678 = (8/3)/613
0.00652528548123848 = 24/3678 = 4/613
0.00870038064165130 = 32/3678 = (16/3)/613
0.00978792822185770 = 36/3678 = 6/613
0.01305057096247696 = 48/3678 = 8/613

The interesting thing here is the 613. These are all nice small fractions divided by 613 and I really wonder where that number comes from, since it is a rather large prime.
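These identifications are easy to check mechanically, e.g. with Python's `Fraction` (the decimals below are copied from the list above):

```python
from fractions import Fraction

# Recover each csv decimal as a fraction with denominator dividing
# 3678 = 2 * 3 * 613.
decimals = {
    "0.00108754758020641": Fraction(4, 3678),
    "0.00326264274061924": Fraction(12, 3678),
    "0.01305057096247696": Fraction(48, 3678),
}
for s, expected in decimals.items():
    found = Fraction(s).limit_denominator(3678)
    assert found == expected
    print(s, "=", found, "| denominator divides 3678:", 3678 % found.denominator == 0)
```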

Oh yeah, and very nice work Lucas

5. ### Lucas GarronSuper-Duper ModeratorStaff Member

Thank you, although I think it remains to be seen how nicely.

Yeah, I have no idea. I think it's true, because I think of each backtracking as going down another random walk that still evens out, but that's so non-rigorous, it just might not be true.

Thing is, I'm turning both the top and bottom randomly, which allows (0,0). I consider excluding it somewhat artificial, because we should allow any random turn, even the null turn (compare: a random state could be the solved state).
How about (6,6)? Do we throw that out, too? Sometimes there are yet other ways that bring you back to the previous shape without backtracking, or even ways that return to the same shape.
I decided not to go for the headache of deciding (and later defending) subtle issues, so I just ran it straight, with each way to get from one shape to another (or to the same one) counting equally.

http://cube.garron.us/sq1/square1graph.nb
(First time I tried adding sections, etc. It makes the notebook look so much more finished. )

Also, sq1graph.csv now lists distances, for convenience.

6. ### cuBerBruceMember

From Jaap's puzzle page, the number 3678 (the actual lowest common denominator, of which 613 is a prime factor) comes from counting the shapes in a way that also takes into account the number of ways a shape can be split into two halves. This gives a count of 37*1 + 62*12 + 46*46 + 12*62 + 1*37 = 3678. (Whereas the normal way to count the shapes is 5*1 + 10*3 + 10*10 + 3*10 + 1*5 = 170.)
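Bruce's arithmetic is quick to verify; the per-class numbers below are the ones quoted above:

```python
# Shape counts by class, pairing each top-layer class with the
# complementary bottom-layer class (as in the sums quoted above).
split_counts = [37, 62, 46, 12, 1]   # counting slice alignments
plain_counts = [5, 10, 10, 3, 1]     # the usual shape count

assert sum(a * b for a, b in zip(split_counts, reversed(split_counts))) == 3678
assert sum(a * b for a, b in zip(plain_counts, reversed(plain_counts))) == 170
print("3678 and 170 confirmed")
```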

7. ### JBCM627Member

Yeah, sections do

The notebook does look good. I couldn't evaluate everything due to compatibility issues; I'll have to look at it again later when I have more time and a different computer. The 6.whatever version I have can't import Combinatorica, it seems. The fun stuff at the bottom is especially nice, and useful too.

I guess I'll agree with allowing "null" turns too. Since the square shape is a first-order node, I'm not sure if you could even disallow backtracking.

The only potential improvement that stands out to me is the use of "MatrixPower[N[adj], 10000]". This is what I meant by exact vs. approximate values earlier... iterating 10000 times sort of defeats the point of using Markov chains. Instead, the matrix should just be diagonalized, and the diagonal factor taken in the limit as the power goes to infinity... all eigenvalues less than 1 will disappear, hopefully leaving you with the limiting case.
In pseudo-Mathematica notation:
Pinv = Inverse[P];
Dinf = Limit[Diag^n, n -> Infinity];  (* yes, the elementwise power is correct here, since Diag is diagonal... don't use MatrixPower *)
leaving you with:
Probabilities = P.Dinf.Pinv
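The same diagonalization idea can be sketched outside Mathematica. Here is an illustrative numpy version on a toy 3-node walk (a stand-in for the real matrix), where only the eigenvalue-1 component survives the limit:

```python
import numpy as np

# Toy symmetric count matrix, row-normalized into a transition matrix P.
adj = np.array([[0, 2, 1],
                [2, 0, 1],
                [1, 1, 2]], dtype=float)
P = adj / adj.sum(axis=1, keepdims=True)

# In the limit P^n, every eigenvalue with |lambda| < 1 dies off; only the
# eigenvalue-1 component survives. The stationary distribution is the left
# eigenvector of P for eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)          # columns are left eigenvectors of P
k = int(np.argmin(np.abs(vals - 1)))     # locate eigenvalue 1
pi = np.real(vecs[:, k])
pi /= pi.sum()
print(pi)
```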

8. ### blahbrah


PS: No offense, I'm a geek myself

9. ### Lucas GarronSuper-Duper ModeratorStaff Member

Thing is, matrix^n in Mathematica is an elementwise power, while MatrixPower uses repeated dot products, which I think is what we want.
The limit fails on your Dinf (I end up with infinities all over, which makes Probabilities a 170x170 matrix with every entry infinite). But if I raise it to some sufficiently high power instead of taking the limit, the diagonal of Probabilities is correct. The same probabilities, and no more efficient a computation.
And the limit of MatrixPower[adj, n] doesn't work so well either (I think it will eventually give the correct answer, but it's really slow).

Basically, if we're going to N[], like you did, MatrixPower to the 1000th works well enough. But do tell me if you find code that will produce the exact fractions from a reasonably computable limit.
Right now, I'm going to try to diagonalize better, but since we already have the exact fractions, I don't think it's such a big issue.

P.S.: I really need to get more eigencomfy with linear algebra.

Last edited: Sep 5, 2008
10. ### JBCM627Member

What version of mathematica are you using, Lucas? I have access to 5 on the physics computers here, if you are using that... and will try this tomorrow. Otherwise, I'll have to find a way to get an earlier version...

Edit: You can use Diag^n because it is a diagonal matrix; that's the special case where elementwise powers and matrix powers agree, which is why diagonalizing is so useful.
Short example:
A^x = (P.D.P^-1)^x = P.D.(P^-1.P).D.P^-1 ... = P.D^x.P^-1

Try Eigenvalues[N[adj]], and see if any of those are over 1... they should all be less than or equal to 1.

Last edited: Sep 5, 2008
11. ### Lucas GarronSuper-Duper ModeratorStaff Member

Yeah, I know about diagonalizing, but maybe I don't know enough about other eigenstuff. Anyhow, I think the issue is with how Mathematica handles things; I'll try it again when I get version 6.

So, through dinner, I set my computer to compute the explicit, exact eigenvalues of the transition matrix. The results were so surprising, I actually laughed out loud at their beauty:

Of the 170 eigenvalues, exactly 90 are non-zero. That is certainly not a coincidence; there are 90 shapes, and hence 90 unique rows. (From minors and determinants, I can imagine that you get just the right number of zeros, but it's still pretty, and some time I'd like to fully understand why.)

Anyhow, the 90 non-zero eigenvalues are of three forms:
• 1 (The 1 root of 1-x)
• The 64 roots of - 4991442219 + 386948115252*x + 138762924658242*x^2 - 3103222624916364*x^3 - 877640411309612746*x^4 + 6767666047147902260*x^5 + 2012337578608379698422*x^6 - 8958764174373753804464*x^7 - 2105450969798079776924812*x^8 + 11046600222902203991043312*x^9 + 1123406093781088176157756896*x^10 - 8114779285022492769604795424*x^11 - 325472049644439264188809738432*x^12 + 2962546562996861776481954741760*x^13 + 56287388574936353280108941504768*x^14 - 629949208560303852773245517141504*x^15 - 6105099366584694343157749747552256*x^16 + 86874786230163907381580006692296704*x^17 + 406640998126424132369607623831846912*x^18 - 8273097635823613171991383746279940096*x^19 - 12847215092645959516368232004260052992*x^20 + 563542116540418623958067303673961512960*x^21 - 380249727800812172360337111385392087040*x^22 - 27876394040223235000410822110796320866304*x^23 + 70200985683043152366609324564260935958528*x^24 + 995061174735599595418926716008043747737600*x^25 - 4377190735910344574289281403129530741161984*x^26 - 24483749753436884219275245277429039028502528*x^27 + 172693480479595766346198778149152449118601216*x^28 + 344268199358502700321288117865500491022073856*x^29 - 4765625577836958680705390357889539456775487488*x^30 + 772858710665023800732181764002139803146518528*x^31 + 93400462329952268786337346115591965649619910656*x^32 - 173918077700241084337687739506139579202876211200*x^33 - 1244927272170148943873781598261313236445492674560*x^34 + 4715876999443357796546772004551277481760550551552*x^35 + 9204457647103604719171194562707870353618733367296*x^36 - 74472880131178952274776792185662826546928673619968*x^37 + 17443350194669512509483766280210174159010360459264*x^38 + 751913910049235990870614175455332842992117179482112*x^39 - 1392779124922777607824049090376214683191153824628736*x^40 - 4328092041600080883386980019497279165272616983330816*x^41 + 18110807337525247976881146547787829180504891067465728*x^42 + 2644680623656263536542849631837447260694572425019392*x^43 - 
124957758646564498523954850327436181770010773552103424*x^44 + 181485548821389832364276808943176572906247959082958848*x^45 + 406331124721305918838666283081528680333446852510744576*x^46 - 1548126628345090784471770960023044689718520670775672832*x^47 + 549713325037757110937937164284329784848484888731451392*x^48 + 5552249583340000587865342011734478531902726023208239104*x^49 - 10580714260899344013132135083176015760588301879893557248*x^50 - 2510540104530940516126750428486043682581121309979181056*x^51 + 35729814707763782643150469325553615343782749037989462016*x^52 - 47192725438731643771622400943584186952215159805015228416*x^53 - 14671327554181472737539318334576376054183452695928504320*x^54 + 121677626730594369246671048452427099031448501592531140608*x^55 - 153528427155205489249202015529459568317970698252614369280*x^56 + 31072468697838310383691091990364643611438025213753688064*x^57 + 166184701748611768310167010069620415713246037605028462592*x^58 - 278996494557677233506937637885590804628898472557432274944*x^59 + 244873692347800039598793398534385002894376872578960588800*x^60 - 136778495367914907214791777149024478324242970827210883072*x^61 + 49139518222010279228071785788096857437263861359797338112*x^62 - 10462695095295681452359646651435512209985692421953945600*x^63 + 1011159794444683308147509475038062924991905844806287360*x^64
• The 25 roots of -41864 + 3136179*x + 386613206*x^2 - 16868236200*x^3 - 402432346544*x^4 + 21212854848768*x^5 - 58429160850880*x^6 - 5476338956938432*x^7 + 47709591633416832*x^8 + 458983451386437120*x^9 - 6981888874709569536*x^10 - 1163160407514488832*x^11 + 392768286252547276800*x^12 - 1520451906669906886656*x^13 - 7565285380895954436096*x^14 + 68553662577750099099648*x^15 - 78233310032907038883840*x^16 - 913347325546567116521472*x^17 + 4118199373346723755720704*x^18 - 3435110820984016104062976*x^19 - 24441765912940143332818944*x^20 + 99523299112333703179665408*x^21 - 183592589424988418224422912*x^22 + 191248395143239778961457152*x^23 - 108940123945387453071753216*x^24 + 26498949067796948044480512*x^25
Okay, someone explain that.
(Note: This makes 90 roots: 1^2+5^2+8^2. Why squares?)

There is a lot of fascinating math about Square-1.
There are those equations, and I have no idea about the coefficients yet.
And I still don't know why the 613 applies to the eventual distribution. Maybe there's something to do with shape symmetries?

12. ### Lucas GarronSuper-Duper ModeratorStaff Member

Okay, I think I can explain the probability distribution. Apparently it's mostly what I suspected.
Each shape's weight (its probability, in units of 1/3678) is its degree in the full graph divided by its number of AUF/ADF symmetries.

So, it perfectly corresponds to the 3678, and Bruce's explanation is half of it.
Consider each shape the Square-1 can have, such that it's currently /-twistable (there's a crack going around the entire middle, and the 16 edge and corner pieces are essentially four half-sides). Each of these 3678 shapes has exactly the same probability. (If you want to consider the middle two pieces, divide the probability in half for its two possible states. But really, we can ignore the middle for most of this.)

This distribution is stable. Consider 3678 Square-1s, each in one of those shapes. The distribution of the "170" shapes is given by the probabilities I found.
Now, imagine giving each of these a / twist. The 3678 shapes will be permuted (in 2-cycles), but each of them is still represented, and each of the "170" shapes still occurs with the same probability. Performing a random AUF&ADF on each does not change the distribution either, as all AUF&ADF variants of a shape are already represented equally, and they map to each other with equal probability.

With a little insight, it's intuitive that the distribution is stable, and it's not too hard to believe that it's unique and that all random walks will approximate it.
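The general fact being used here: in a random walk on an undirected (multi)graph with symmetric edge weights, a node's stationary probability is proportional to its (weighted) degree. A quick simulation on a toy multigraph, unrelated to the actual shape graph:

```python
import random
from collections import Counter

# Adjacency as multisets of neighbors; parallel edges and self-loops allowed.
# The weights are symmetric, as in the shape graph.
nbrs = {0: [1, 1, 2], 1: [0, 0, 2], 2: [0, 1, 2, 2]}
rng = random.Random(1)
steps = 200_000
node, visits = 0, Counter()
for _ in range(steps):
    node = rng.choice(nbrs[node])
    visits[node] += 1

total = sum(len(v) for v in nbrs.values())
for v in sorted(nbrs):
    print(v, round(visits[v] / steps, 3), "expected", len(nbrs[v]) / total)
```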

So, I think the "natural" shape distribution is resolved. Thanks to cuBerBruce and JBCM627, without whom I probably wouldn't have found it, and qqwref, who specifically helped me think through the final "solution."

So all that only (yeah right, "only") leaves me to wonder now why the degree-90 characteristic polynomial factors into 3 polynomials with 1, 25, and 64 roots. (Which root corresponds to which shape? Does that even make sense to say?)

13. ### JBCM627Member

I'm having trouble getting it to diagonalize as well. It looks like it is just due to small decimal errors in Mathematica, though; P.Pinv isn't even giving me I, but some other matrix filled with small floating-point errors. The code I posted earlier at least works for the distribution problem Stefan posted in the WCA forum, so I don't think diagonalizing like this is a problem; either something in the initialization is causing it to calculate approximately instead of exactly, or Mathematica just doesn't want to do it (I'm on a cruddy computer right now... it took a few minutes just to open Mathematica :/). I wish the other lab were open now; but I give up and will accept approximations till I can get in there.

The probability distribution is quite interesting. I was able to generate that polynomial (well, more or less... it's giving me an imaginary part right now too and won't let me Re[]). It is therefore also having trouble factoring. As to why it ends up as 3 polynomials like that, I have no idea. It is certainly at least possible to match the shapes up with eigenvalues/roots, but as to whether certain shapes fall into a certain polynomial, I don't know if a pattern will emerge.

Edit:
Cool visual implementation I found (not necessarily what we are looking for, though); but something like this could be done for probabilities of shape nodes: http://demonstrations.wolfram.com/TheEigenvectorsOfARandomGraph/
Also cool (of course!): http://demonstrations.wolfram.com/RubiksCube/

And again:
Got it working in 6; it was a stupid bug... Rotate[] is already reserved (defined for some graphics thing) and I hadn't replaced all instances of it.
The package to import is just <<Combinatorica` ... I'm not sure if the renaming replaced something else too, though. Anyhow, it looks like it's working.

Yet again:
There we go. I like my computer. Duh, N[]. I'll let my computer run this for a while now...

Again:
After about an hour, I'm pausing this thing. I really wish there were some way to monitor the evaluation's progress, but it appears not. I almost wonder if I could do this by hand faster, since I already know the eigenvalues. Will let it keep running when I go to bed.

And finally:
After what looks like 11 hours, I'm killing this. I don't know why it's taking so long... all I did was Eigenvectors[adj].

Last edited: Sep 6, 2008
14. ### smskill12Member

lucas you should start speedcubing with the square-1 again
unless you have been

15. ### JBCM627Member

So, we finally covered markov chains in class, after which I hit my head, and went back to Lucas's notebook.

Finding exact values is a bit easier than I initially thought - you only need to consider the eigenvalue 1 (which is now obvious to me as this is the only eigenvalue that will remain nonzero in the limit when you raise the eigenvalues to an infinite power). This also serves to further clarify that the approximate numbers are correct:
Code:
p = NullSpace[IdentityMatrix[170] - Transpose[adj]]
This gives you relative values. So to find the exact probabilities, you will need to multiply by 1/S, where S is the Sum of your elements. Not being too familiar with mathematica, I did this in a sort of roundabout way:
Code:
f = 1/Tr[p*IdentityMatrix[170]]
So, your probabilities are simply f*p.

Code:
{4/1839, 4/1839, 4/1839, 4/1839, 5/3678, 6/613, 6/613, 6/613, 8/613, \
8/613, 8/613, 6/613, 6/613, 6/613, 2/613, 4/613, 4/613, 4/613, \
16/1839, 16/1839, 16/1839, 4/613, 4/613, 4/613, 4/1839, 2/613, 2/613, \
2/613, 8/1839, 8/1839, 8/1839, 2/613, 2/613, 2/613, 2/1839, 6/613, \
4/613, 4/613, 4/613, 6/613, 6/613, 4/613, 4/613, 6/613, 2/613, 4/613, \
8/1839, 8/1839, 8/1839, 4/613, 4/613, 8/1839, 8/1839, 4/613, 4/1839, \
4/613, 8/1839, 8/1839, 8/1839, 4/613, 4/613, 8/1839, 8/1839, 4/613, \
4/1839, 4/613, 8/1839, 8/1839, 8/1839, 4/613, 4/613, 8/1839, 8/1839, \
4/613, 4/1839, 6/613, 4/613, 4/613, 4/613, 6/613, 6/613, 4/613, \
4/613, 6/613, 2/613, 6/613, 4/613, 4/613, 4/613, 6/613, 6/613, 4/613, \
4/613, 6/613, 2/613, 4/613, 8/1839, 8/1839, 8/1839, 4/613, 4/613, \
8/1839, 8/1839, 4/613, 4/1839, 4/613, 8/1839, 8/1839, 8/1839, 4/613, \
4/613, 8/1839, 8/1839, 4/613, 4/1839, 6/613, 4/613, 4/613, 4/613, \
6/613, 6/613, 4/613, 4/613, 6/613, 2/613, 2/613, 4/1839, 4/1839, \
4/1839, 2/613, 2/613, 4/1839, 4/1839, 2/613, 2/1839, 6/613, 4/613, \
2/613, 6/613, 4/613, 2/613, 6/613, 4/613, 2/613, 8/613, 16/1839, \
8/1839, 8/613, 16/1839, 8/1839, 8/613, 16/1839, 8/1839, 6/613, 4/613, \
2/613, 6/613, 4/613, 2/613, 6/613, 4/613, 2/613, 2/613, 4/1839, \
2/1839, 4/1839, 4/1839, 4/1839, 4/1839, 5/3678}
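For anyone without Mathematica, the same NullSpace computation can be reproduced in exact arithmetic with, e.g., sympy. The 3x3 matrix here is an illustrative stand-in for adj, not the real shape graph:

```python
from sympy import Matrix, Rational, eye

# Toy symmetric count matrix and its row-normalized transition matrix.
adj = Matrix([[0, 2, 1],
              [2, 0, 1],
              [1, 1, 2]])
P = Matrix(3, 3, lambda i, j: Rational(adj[i, j], sum(adj.row(i))))

# Left eigenvector for eigenvalue 1: the null space of (I - P^T).
ns = (eye(3) - P.T).nullspace()
p = ns[0] / sum(ns[0])        # normalize so the entries sum to 1
print(list(p))                # exact fractions
```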

16. ### StefanMember

Digging this out for clarification requests...

Do I understand this thread correctly, that this is achieved by randomly picking a position from all with equal probability? So could one for example randomly permute the 16 pieces and then put them in that order clockwise from the top front gap, then clockwise from the bottom front gap (just making sure it's actually two halves and starting over if not)? Would that represent the many-random-moves idea?
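A sketch of that rejection procedure, under these assumptions: corners are 2 units wide, edges 1 unit, each layer spans 12 units, and a layout is rejected when no piece boundary lands exactly on the layer split (names are made up):

```python
import random

def random_shape(rng):
    """Shuffle the 16 pieces and deal them clockwise; reject and retry
    when a corner straddles the top/bottom split."""
    pieces = ["C"] * 8 + ["E"] * 8            # 8 corners, 8 edges
    while True:
        rng.shuffle(pieces)
        total = 0
        for i, p in enumerate(pieces):
            total += 2 if p == "C" else 1
            if total == 12:                   # top layer fills exactly
                return pieces[:i + 1], pieces[i + 1:]
            if total > 12:                    # corner straddles the layers
                break                         # reject and reshuffle

top, bottom = random_shape(random.Random(1))
print("top:", top, "bottom:", bottom)
```

Note this only makes the two layers valid; whether the slice alignment should also be constrained is part of what the thread is debating.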

Do the results change if we use a more 3x3x3-like definition like this?

Random sequence of moves of U, R and D without trivial cancellations.

In other words, don't treat U and D differently from R in the definition. Does this end up being the same definition as yours or having the same result as yours? And if not, which definition is "better"?

How does the current WCA scrambler work? Looks like random moves, but I don't understand the code and how it chooses the turns (one of our two definitions?).

Also: The WCA-scrambled state is always /-turnable. But should it be? I find that slightly unnatural.

Why 3? Shouldn't that be 2?

17. ### Mike HugheySuper ModeratorStaff Member

Ugh. I should have searched for a thread like this when I was working on my square-1 BLD method. I was thinking it would be nice to have probabilities of shapes so I could learn them in order of likelihood of getting them, so I could be more likely to get the ones I had already memorized. But I guess I'm glad now that I didn't, since otherwise I might not have been as motivated to learn them all. I guess I didn't remember this thread because it wasn't something that interested me that much back then.

Why not 4? For my memorization purposes, a square has 4 possible orientations. It makes getting to square consistently a little tricky because I need to make sure I always orient it the same from the previous position. (I solved it by always using minimal movement - zero or 30 degrees.) Maybe I don't understand what "orientation" means in this context.

(And by the way, I hate the lack of ability to automatically get quote trees with the new site for cases like this. Stefan's "Why 3? Shouldn't that be 2?" quote is completely meaningless without the preceding quote.)

18. ### qqwrefMember

I would like to do this, but I don't know a reasonably efficient solving algorithm, except for Jaap's program which uses many MB of tables and doesn't have any graphical interface. Is there one? Could someone make one? Even a program which uses many MB of tables could work fine if it could be prettied up in CubeExplorer fashion, because we already use one external program to make scrambles and it wouldn't really be too much to ask to have another.

I don't know about results, but there is a fundamental difference between U/D and R in the Square-1 (as it is a dihedral puzzle, not a symmetrical one where all layers work equally) so it would be inappropriate to treat the layers equally when scrambling. As for "better", well, see above - we should be trying to move towards a random-state scrambler.

I don't find this unnatural. You wouldn't provide a 3x3 scramble with the top layer turned 45 degrees and say that it's fine since two layers can be turned. Even though the Square-1 is bandaged, I think the same reasoning makes sense: we can always use positions that allow all layers to be turned, so we should, because other positions are between the kind of real states that we would be able to see during a non-canceling move sequence.

19. ### StefanMember

There's one here now:
http://83.169.19.211/sq1/

Don't know how it works and how much resources it takes, though.

Why? For example, what is wrong with my above definition which does treat them equally?

I now think it does differ, at least compared to the WCA scrambler. I just tested the WCA scrambler a bit, did this three times:
- Generate 10 scrambles of 999 moves to get a lot of data
- Remove (0,0) from the ends cause I don't want to count that
- Count (0,y), (x,0) and (x,y)

The results:
First attempt: 1211, 1207, 4138
Second attempt: 1184, 1159, 4113
Third attempt: 1212, 1166, 4124

So (U,D) turns with U or D being 0 consistently occurred significantly more than half the time.

Now consider my scramble definition: random non-canceling sequence in <U,R,D>. It can still be written in the current WCA notation, of course. Imagine we just had an R turn. Next must be a U or D turn, let's say it's D. Then next must be a U or R turn. I see two options to weigh them:

- Don't weigh them, they have equal probability. Half the time it's the R and we get a (0,D) or (U,0) turn. So the (U,D) turns with U or D being 0 should occur half the time.
- Weigh them according to how many possibilities each layer has. R only has one, U probably has several. So the (U,D) turns with U or D being 0 should occur significantly *less* than half the time.

Since the WCA scrambler appears to have them significantly *more* than half the time, not half the time or significantly *less* than half the time, we seem to differ.

Absolutely. But my question is whether my scrambling-by-turning definition results in a different probability distribution, affecting how the random-state scrambler picks the random state.

Imagine someone who's not already biased by knowing how to use the puzzle. In other words, a non-cuber. I'm quite sure they align the 3x3x3 properly, but not necessarily the square-1! They might very well end up with a U turn only resulting in a UF gap but not a UB gap.

Last edited: Sep 20, 2010
20. ### qqwrefMember

Neat. But it's php based so there's no way to check whether it works (or how well it works) without hacking. Maybe he has some random-state code feeding into Jaap's solver and it's all run on a server somewhere.

I see, the third number is the first two PLUS the (x,y) with nonzero x and y.

I looked at the code and it uses Jaap's scrambler which I do not think is the best one. It has some weird biases and the functioning is pretty much opaque. Of course it is pretty much impossible to get any new scrambler past the WCA so that will have to do. The scrambler in qqtimer is better and functions as follows:
- Choose a random move of the form (x,y), with x and y chosen completely randomly. If x=y=0 (unless it's the first move) or the move is impossible, try again. If we don't have enough remaining moves to do this, try again.
- After performing it, if we have at least one remaining move, do a /.
- Repeat.
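For concreteness, the loop above can be modeled in code. Everything below is a hedged sketch, not qqtimer's actual code: a layer is a circular list of piece widths (corner = 2, edge = 1) in 30-degree units, turn amounts are taken mod 12, a (x,y) is treated as "impossible" when it fails to leave cuts at the slice plane on both layers, and the move-count bookkeeping is omitted (a / after every (x,y)):

```python
import random

def boundaries(layer):
    """Angular positions (mod 12) where a cut separates two pieces."""
    out, s = set(), 0
    for w in layer:
        out.add(s)
        s += w
    return out

def can_turn(layer, x):
    """A turn by x*30 degrees lines cuts up with the slice plane
    iff the layer has cuts at both x and x+6."""
    b = boundaries(layer)
    return (x % 12) in b and ((x + 6) % 12) in b

def turn(layer, x):
    """Rotate so the piece starting at position x comes to the front."""
    s = 0
    for i, w in enumerate(layer):
        if s == x % 12:
            return layer[i:] + layer[:i]
        s += w

def half(layer):
    """Split a sliceable layer into the pieces covering [0,6) and [6,12)."""
    s = 0
    for i, w in enumerate(layer):
        if s == 6:
            return layer[:i], layer[i:]
        s += w

def slash(top, bottom):
    """Swap the front halves; the turned block flips, reversing piece order."""
    t1, t2 = half(top)
    b1, b2 = half(bottom)
    return b1[::-1] + t2, t1[::-1] + b2

def scramble(moves, seed=None):
    rng = random.Random(seed)
    top = [2, 1, 2, 1, 2, 1, 2, 1]     # cube shape: alternating corner/edge
    bottom = [2, 1, 2, 1, 2, 1, 2, 1]
    for _ in range(moves):
        while True:                    # rejection sampling of (x, y)
            x, y = rng.randrange(12), rng.randrange(12)
            if (x, y) != (0, 0) and can_turn(top, x) and can_turn(bottom, y):
                break
        top, bottom = turn(top, x), turn(bottom, y)
        top, bottom = slash(top, bottom)
    return top, bottom

top, bottom = scramble(25, seed=2008)
print("top:", top)
print("bottom:", bottom)
```

Since (6,6), (6,0), and (0,6) are always legal on a sliceable puzzle, the rejection loop always terminates.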

We differ because I initially thought the official scrambler used the good scrambling algorithm, which it doesn't. I also assumed that when you said the U/D/R layers would be treated equally you were essentially forcing the first weighing. I think the second weighing is a good way to go, if we must generate the moves randomly.

I don't think we should ever go by what people who don't "know[...] how to use the puzzle" think. If you think we should, why not just disassemble the puzzles and put them back together? Or, even worse, why not take off the stickers and reapply, for our scrambles? I prefer to go by the logical route of mixing up a puzzle as close to uniformly randomly as possible, without getting in the way of its proper functioning. Having a Square-1 in a position where / is impossible does not seem any more reasonable than having it halfway through a / move, or having a 3x3 off by 45 degrees, or having a 5x5 halfway through a V-lockup.