Nick Gall's Weblog
[NOTE: I have moved. My new blog is ironick.typepad.com.]
        

Friday, August 08, 2003

Think orgasmically.
Saw this post entitled Think orgasmically on Scripting News, and it reminded me of an epiphany I had during a series of insights:

 

Epiphanies are functionally similar to orgasms in that both are emotional "rewards" for "connecting."

 

In the case of orgasm, the connection is sexual, while in the case of epiphany, the connection is conceptual. That is, my epiphanies occur when I connect concepts. Heidegger would note that both forms of connection share the same Latin root: copula, to link. <grin>

 

Of course, not every conceptual connection triggers an epiphany. Dave Winer says he has them when he has an idea that mixes well with the world. My epiphanies are triggered when I make a broad set of connections between concepts that had seemed unconnected, and those connections lead to a whole series of insights.

 

Why do both ways of connecting (sexual and conceptual) trigger an emotional feeling of pleasure and satisfaction? I believe it is because both forms of connecting are essential to evolution: genetic evolution in the one case, memetic evolution in the other.

 

BTW, I disagree with Dave's apparent suggestion that it is mostly women who understand this relationship between sexual and conceptual connection. Google "epiphany orgasm" and you'll find many men observing the connection between the two. When I did so, I found at least one man who has apparently thought about this extensively: Neil Greenberg, Professor of Ecology and Evolutionary Biology in the College of Arts and Sciences, University of Tennessee, Knoxville. His web site has interesting pages on epiphany as a state of consciousness triggered by connecting. Here's one other man who's made the connection.

 

My four greatest epiphanies (roughly spaced by decades) were triggered by the following (in order):

  • The realization of the finality of death and my embrace of atheism
  • Derrida
  • Rorty
  • The realization of the fundamental importance of conceptual connection in the evolution of complexity

 

I hesitated to post this because it sounds so wacko, but given how many others have commented on the phenomenon, I felt I had to add my perspective on the issue.


1:56:01 PM      

Complexity is Relative.
I have been trying (like lots of others) to better understand complexity. And it struck me that we often speak of a given object as being both simple and complex. For example:

  • My TV is simple to operate, but really complex to set up.
  • This application is simple to install, but complex to upgrade.

To investigate this insight further, I did what I always do: Google. I googled "complexity is relative". One of the first results was this amazing paper entitled Entropy as a fixed point. Apparently it was just published in February 2003. Given my previous post suggesting that complexity evolves to maximize entropy, I was amazed to see this connection to entropy in the title of the paper!

I was even more amazed when I read the abstract:

We study complexity and information and introduce the idea that while complexity is relative to a given class of processes, information is process independent: Information is complexity relative to the class of all conceivable processes. In essence, the idea is that information is an extension of the concept algorithmic complexity from a class of desirable and concrete processes, such as those represented by binary decision trees, to a class more general that can only in pragmatic terms be regarded as existing in the conception. It is then precisely the fact that information is defined relative to such a large class of processes that it becomes an effective tool for analyzing phenomena in a wide range of disciplines.

We test these ideas on the complexity of classical states. A domain is used to specify the class of processes, and both qualitative and quantitative notions of complexity for classical states emerge. The resulting theory is used to give new proofs of fundamental results from classical information theory, to give a new characterization of entropy in quantum mechanics, to establish a rigorous connection between entanglement transformation and computation, and to derive lower bounds on algorithmic complexity. All of this is a consequence of the setting which gives rise to the fixed point theorem: The least fixed point of the copying operator above complexity is information.

While I don't understand the formal discussion (yet), I am encouraged to see someone dealing formally with my intuition that complexity is relative to the process involved with (an) information (structure). This also reinforces my intuition that process and information are a duality.
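
To make the paper's idea of "complexity relative to a class of processes" a little more concrete for myself, here is a toy sketch in Python (my own illustration, not code from the paper; the function names and the restricted "linear scan" class of processes are just assumptions I made up for the example). The same distribution over outcomes has a different complexity, measured as the average number of yes/no questions needed to identify an outcome, depending on which questioning processes are allowed, while Shannon entropy is the process-independent floor that the richest class (arbitrary binary decision trees) can match.

import math, heapq

def shannon_entropy(p):
    # Process-independent floor: Shannon entropy in bits.
    return -sum(q * math.log2(q) for q in p if q > 0)

def decision_tree_cost(p):
    # Complexity relative to the class of all binary decision trees:
    # average number of yes/no questions for an optimal (Huffman-style) tree.
    heap = list(p)
    heapq.heapify(heap)
    cost = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        cost += a + b              # each merge adds one level of questions
        heapq.heappush(heap, a + b)
    return cost

def linear_scan_cost(p):
    # Complexity relative to a poorer class of processes that may only ask
    # "is it this particular outcome?" one outcome at a time, in a fixed order.
    p = sorted(p, reverse=True)    # best ordering available to this class
    n = len(p)
    return sum((i + 1) * q for i, q in enumerate(p[:-1])) + (n - 1) * p[-1]

uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform))     # 2.0 bits
print(decision_tree_cost(uniform))  # 2.0 questions: matches the entropy floor
print(linear_scan_cost(uniform))    # 2.25 questions: higher complexity relative to this class

Same state, different complexity depending on the class of processes; only the entropy stays fixed. That is roughly how I read the claim that information is complexity relative to the class of all conceivable processes.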


7:27:36 AM      

© Copyright 2006 Nicholas Gall.
 