
cau yue wu yi shi
21st-Oct-2005 12:55 pm

I haven't read the whole section yet, and perhaps I won't understand it all when I do, but it looks like the problem with the axiom of choice may be summarized in the idea of 'interpenetration'.

The problem as I see it is this: you can pick one object from each of infinitely many bins, but if some bins overlap (as definitions, or sets of defined numbers, sometimes do), you end up with many bins that are non-empty yet share a single object. Each bin is non-empty, but between them they should only have a 'fractional' portion to give, if each were to contribute equally to the choice function. So if you have an infinite set of bins but a finite set of objects, for instance, you run into a problem, unless the 'portion' each object gives is run down to a limit or something.
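To make the overlap idea concrete, here's a small toy sketch in Python (my own made-up bins and choice rule, nothing official): three bins share one object, a naive choice function picks that same object more than once, and splitting each object's 'give' evenly among the bins that contain it leaves some bins with only a fraction.

    from collections import Counter

    bins = {
        "A": {1, 2},
        "B": {2},      # overlaps A on object 2
        "C": {2, 3},   # overlaps A and B on object 2
    }

    # A simple choice function: pick the smallest element of each bin.
    choice = {name: min(objs) for name, objs in bins.items()}
    print(choice)  # {'A': 1, 'B': 2, 'C': 2} -- object 2 gets chosen for two bins

    # Count how many bins each object sits in, then split its 'give' evenly.
    membership = Counter(obj for objs in bins.values() for obj in objs)
    give = {name: sum(1 / membership[o] for o in objs) for name, objs in bins.items()}
    print(give)    # bin 'B' only has 1/3 of an object 'to give'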

Other ways to solve the problem might involve 'recursive' or fractal-type phenomena, things that take holism and self-similarity into account, so that the same thing isn't reproduced again and again. Reuse of functions through 'function libraries' and function calls in programs might be an example of this: something existing only once in a 'direct' form, with other instances occurring as 'links' or 'reflections' of the original. (See Indra's Net for more on self-similarity, and Fritjof Capra's "Tao of Physics" for other metaphors drawing parallels between intuitive and rational views of the universe.)
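Here's a tiny Python sketch of that 'one direct instance, other occurrences as links' idea (my own toy code, essentially what programmers call interning, or the flyweight pattern):

    # An intern table keeps a single canonical copy; everything else
    # holds a reference to it instead of reproducing the data.
    _canonical = {}

    def intern_obj(value):
        """Return the one shared instance for this value, creating it on first use."""
        return _canonical.setdefault(value, value)

    a = intern_obj((1, 2, 3))
    b = intern_obj((1, 2, 3))
    print(a is b)  # True -- b is a 'link' to the same object, not a second copy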
Another example might be the idea of 'templates', where other instances are in fact 'transformations' of the template. The transformations are not inherent in the object but are, for instance, stored separately, using the 'symmetries' of/between a given template and an object to 'compress information'. (This is most useful in a complex 'image'-type scenario where only minor changes occur, so mapping the change as a function against the original saves on 'raw data' at the expense of additional 'processing'.)
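A concrete toy version of the template idea (again my own sketch, in Python): store one base template and, for each variant, only the positions where it departs from the template, then reapply that stored transformation to get the variant back.

    # One shared template; variants are stored only as their differences from it.
    template = [0, 0, 0, 0, 0, 0, 0, 0]

    def diff(variant, base=template):
        """Record only the positions where the variant departs from the template."""
        return {i: v for i, (v, b) in enumerate(zip(variant, base)) if v != b}

    def rebuild(delta, base=template):
        """Reapply the stored transformation to the template to recover the variant."""
        out = list(base)
        for i, v in delta.items():
            out[i] = v
        return out

    variant = [0, 0, 9, 0, 0, 0, 7, 0]
    d = diff(variant)
    print(d)                      # {2: 9, 6: 7} -- far less 'raw data' than the full list
    print(rebuild(d) == variant)  # True, at the cost of a little extra processing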

An example showing this effect might be the decreasing probability of primes occurring as one progresses towards ever larger numbers, i.e. (2*3*5*7*11*13*17*19*...*n)^-1 = probability of there being primes greater than n. If you instead added 1/2 (the probability of a number being even) + 1/3 + 1/5 + 1/7 etc., you would end up adding in the overlapping probabilities: even numbers divisible by 3 are counted once in the 1/2, and then again when you count the 1/3 of numbers divisible by 3.
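Here's a quick numerical check of that double-counting, in Python (my own sketch, using just the primes 2, 3, 5, 7 rather than the whole product): the naive sum of the 1/p values comes out above 1, while handling the overlaps via the product of (1 - 1/p) matches an actual count.

    # N and the prime set are arbitrary choices for the demonstration.
    N = 1_000_000
    small_primes = [2, 3, 5, 7]

    # Naive sum of the individual probabilities -- overlaps are double-counted.
    naive_sum = sum(1 / p for p in small_primes)

    # Product form: fraction of numbers NOT divisible by any of the small primes.
    surviving = 1.0
    for p in small_primes:
        surviving *= (1 - 1 / p)

    # Actual count up to N of numbers divisible by at least one small prime.
    actual = sum(1 for k in range(1, N + 1)
                 if any(k % p == 0 for p in small_primes)) / N

    print(naive_sum)      # ~1.176 -- more than 1, so clearly overcounted
    print(1 - surviving)  # ~0.771 -- overlaps handled
    print(actual)         # ~0.771, close to the product-based figure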
I hope I'm making sense; I'm not a math whiz, just 'math-interested'.
I sort of decided to drop a post at random, since I got caught up trying to learn some math stuff on Wikipedia and thought this might be something I actually had a relevant idea about. I'll cross-link this post with my blog. Might remove it later, depending.

12th-Jul-2009 09:55 pm (UTC)
"probability of primes greater than n" should probably be phrased more accurately as the 'population density' of primes greater than n... but these two things are related...