Saturday 31 December 2005

Rusty Links

Rusty Chain

I took this last week of the year off to organize my things, because I will probably move to the UK next year to start a job applying Statistical Physics to Cryptography. I was cleaning my wardrobe, which holds not only clothes but also books, old toys and a lot of papers. Then I found a folder and a notebook covered with dust and labelled "Internet" (time goes by very fast...). I opened them and found a lot of links that I must have found interesting someday, and I decided to put them on my del.icio.us page. I began this task this morning and they really are interesting links. So, I decided to put them here so everyone who reads this also has the chance to take a look at them. Let me list them in the exact order that I found them:

Sodaplay: classic. Everyone must know it by now, but I'm glad to have rediscovered it.

Jim Loy's Homepage: the amount of information about science-related topics is huge. How this guy manages to write so much is a mystery to me.

2d Curves: a collection of two-dimensional mathematical curves.

Igor Nikitin's Homepage: has an interesting document on String Theory.

Kolmogorov: a very complete page about Kolmogorov.

Wilfrid Hodges' Homepage

Douglas Arnold's Homepage: with an interesting page about Some disasters attributable to bad numerical computing.

The Online Books Page

O Cerebro Nosso de Cada Dia: a site about the brain, originally in Portuguese but with an English translation, Our Daily Brain.

That's it. I will probably find more as my cleaning proceeds, and I'll put them here.

Happy New Year to everyone!

Picture: Rusty Chain by Hilly Wakeford

Thursday 29 December 2005

Phononic Crystals

Phononic Crystal

I found an interesting article in Physics Web by Gorishnyy et al. named Sound Ideas (the beautiful picture above was taken from it). The article talks about a kind of crystal, named a phononic crystal, that can be constructed in such a way as to create specific "band gaps" for waves travelling in the solid. This means that you can control which frequencies cannot propagate in the crystal, creating, for example, materials that act as insulators for particular sounds or mechanical waves.

The band gaps are created by careful design of the crystal, which allows control over the dispersion relation, the relation between frequency and wave number, of the phonons, the quantized modes of vibration of a solid. This quantization of vibrational modes comes from a treatment using the machinery of quantum mechanics and is a very important mechanism that, among other things, influences the heat conductivity of materials. For a short introduction to the theory of elementary excitations in solids see Elementary Excitations in Solids: Lectures on Phonons, Electrons, and Plasmons by David Pines.
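
As a toy illustration of how a band gap shows up in a dispersion relation (a minimal sketch of my own, not taken from the article), here is the textbook one-dimensional chain of two alternating masses connected by identical springs; the acoustic and optical branches are separated by a range of frequencies in which no wave can propagate:

# Phonon dispersion of a 1D diatomic chain: masses m1 and m2 alternate
# along the chain and are coupled by springs of constant C.  The two
# branches are separated by a band gap at the edge of the Brillouin zone.
import numpy as np

C, m1, m2, a = 1.0, 1.0, 2.0, 1.0            # spring constant, masses, lattice constant (arbitrary units)
k = np.linspace(-np.pi / a, np.pi / a, 201)  # wave numbers in the first Brillouin zone

s = 1.0 / m1 + 1.0 / m2
root = np.sqrt(s**2 - 4.0 * np.sin(k * a / 2.0)**2 / (m1 * m2))
w_acoustic = np.sqrt(C * (s - root))         # lower (acoustic) branch
w_optical  = np.sqrt(C * (s + root))         # upper (optical) branch

# Frequencies between the top of the acoustic branch and the bottom of
# the optical branch cannot propagate: this is the band gap.
print("gap from %.3f to %.3f" % (w_acoustic.max(), w_optical.min()))

Real phononic crystals are of course two- or three-dimensional and engineered with much more care, but the mechanism is the same: the periodic structure shapes the dispersion relation.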

Phononic crystals may have a lot of interesting technological applications, described in detail in the article. In the words of the authors:

"Phononic crystals will provide researchers in acoustics and ultrasonics with new components that offer the same level of control over sound that mirrors and lenses provide over light."

Over my desk:

Friday 23 December 2005

Cosmic Collision

Galaxies NGC 2207 & IC 2163

A friend sent me a link named Cosmic Collision this week. It is a subsite of the official Hubble site that describes what the collision of our Milky Way with the Andromeda Galaxy will look like. The site tells the story with a narrated video and has a lot of scientific explanations in a simple but precise way.

The Milky Way is in fact already colliding with minor galaxies of our local neighborhood, the Local Group, such as the Magellanic Clouds, as we all travel in the direction of the Virgo Cluster, but the collision with Andromeda will be much more spectacular due to Andromeda's size. Our planetary system will probably not be affected, due to its tiny size relative to interstellar distances, but the site shows what the night sky will look like during the collision. In the end, both galaxies will merge into a large elliptical galaxy.

The collision will occur about 5 billion years from now, which reminded me of a story someone told me once (I don't remember who...): A scientist was giving a lecture about the death of our Sun. At some point, a person raised a shaking arm and asked in a trembling voice: "Excuse me, professor, when did you say that will happen?". The professor answered: "In about 5 billion years." The guy then took a deep breath and said in relief: "Oh... I thought you had said 5 MILLION..."

As a last comment, the Hubble site has a lot of beautiful pictures and interesting explanations. Don't be in a hurry when navigating there and you will enjoy every mouse click.

Over my desk:

1. Deriving Landauer’s erasure principle from statistical mechanics, Jacobs (quant-ph/0512105).

2. Spin Glasses: a Perspective, Sherrington (cond-mat/0512425).

3. Projective geometry and special relativity, Delphenich (gr-qc/0512125).

4. The Study of the Pioneer Anomaly: New Data and Objectives for New Investigation, Turyshev (gr-qc/0512121).

5. Quantum information and computation, Bub (quant-ph/0512125).

Picture: Colliding galaxies NGC 2207 and IC 2163, NASA.

Tuesday 20 December 2005

Geometric Algebra



The algebra of complex numbers is related to geometry by the Argand plane. Using it, we see that multiplying by i is equivalent to a 90 degree rotation in the counterclockwise direction. A slightly more advanced concept is that of quaternions, which, like complex numbers, are a set of numbers that can represent rotations, in their case in 3D space. In both cases there is a beautiful connection between algebraic structures and geometry that can be used to express physical laws in a concise way.

The best-known way to use geometry in physics is by means of Gibbs' vector calculus, which became widespread in the physical sciences and engineering. In 1878 Clifford created a structure named geometric algebra, uniting the dot and cross products of two vectors into a single entity named the geometric product, which for two vectors a and b is written as
\[ab=a\cdot b +a \wedge b,\]

where the first term is the dot (scalar) product and the second is the wedge or exterior product, which generalizes the cross product; the cross product turns out to be a particular case in 3 dimensions.
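
Just to make the two pieces concrete, here is a small sketch of my own (in 3 dimensions, where the wedge product has three independent components) that splits the geometric product into its scalar and bivector parts:

# Geometric product of two 3D vectors a and b, split into the symmetric
# part a.b (a scalar) and the antisymmetric part a^b (a bivector, with
# components in the planes e1e2, e1e3 and e2e3).
import numpy as np

def geometric_product(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    scalar = np.dot(a, b)
    bivector = {
        "e1^e2": a[0] * b[1] - a[1] * b[0],
        "e1^e3": a[0] * b[2] - a[2] * b[0],
        "e2^e3": a[1] * b[2] - a[2] * b[1],
    }
    return scalar, bivector

# e1 e2 = 0 + e1^e2: orthogonal vectors have a purely bivector product.
print(geometric_product([1, 0, 0], [0, 1, 0]))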

Although it has a lot of applications in physics, it was eclipsed by Gibbs' vector calculus and forgotten until the 1960s, when David Hestenes, trying to recover the geometric meaning of the Clifford algebra related to spin, discovered that geometric algebra is a "universal language for mathematics, physics and engineering."

There is a complete introductory course, available as lecture notes, on the site of the Department of Physics of the University of Cambridge.

The interesting fact, which my former PhD advisor pointed out to me, is that there is hope that this structure can lead to a geometric interpretation of the mysterious use of complex numbers in Quantum Mechanics. However, I need to read more of the lecture notes before I can talk about that.

Papers over my desk (or on my desktop):


  • Vegetation's Red Edge: A Possible Spectroscopic Biosignature of Extraterrestrial Plants - Seager et al. (astro-ph/0503302)
  • Causal Sets: Discrete Gravity (Notes for the Valdivia Summer School) - Sorkin (gr-qc/0309009)
  • The General Quantum Interference Principle and the Duality Computer - Long (quant-ph/0512120)
  • Entropic Priors - Caticha and Preuss (physics/0312131)
  • On Math, Matter and Mind - Hut et al. (physics/0510188)
Picture: Quantum Notions - Gerard von Harpe

Wednesday 14 December 2005

A Matter of Doubt

Black Hole

At the most fundamental level of nature lie two concepts that are central to physics: energy and matter. Energy is a fundamental entity present everywhere. Even empty space contains energy (which renders the term "empty space" a little inaccurate).

Matter is a concept directly associated with mass. Matter particles are particles with mass. Mass started as two "different" quantities: a measure of inertia, which comes from Newton's formula
\[F=ma,\]
and gravitational charge, again given by Newton as
\[F=\frac{GmM}{r^2},\]
where the gravitational constant G is so small that it renders gravity the weakest of all forces in nature. Although nothing in principle says that gravitational charge and the coefficient of inertia should be the same thing, Newton already confirmed by experiment that both quantities agree with great precision. This point was later clarified by General Relativity, where we learned that gravity is a deformation of spacetime and what we see as an attractive force is just a geodesic path, but the detailed explanation can be found in, for example, Robert Wald's General Relativity book and on Sean Carroll's website under the title Lecture Notes on General Relativity, so I will postpone it to a future post.

Mass has been known to be equivalent to energy since Einstein's Special Relativity. His famous formula
\[E=mc^2,\]
which is valid for a body AT REST, means that even an object with no movement and subject to no forces has some energy that can be extracted from its mass. Indeed, the atomic bomb relied on this formula to produce an amazing amount of energy from a relatively small piece of matter.
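
Just to give an idea of the scale (a back-of-the-envelope number of my own, not from any particular source): for one kilogram of matter at rest,
\[E = mc^2 = (1\,\mathrm{kg})\times(3\times 10^8\,\mathrm{m/s})^2 \approx 9\times 10^{16}\,\mathrm{J},\]
which is roughly the energy released by twenty megatons of TNT.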

The making of the atomic bomb shows that extracting energy from matter is (relatively) easy, but the converse is not. The main problem is that we still do not know exactly what mechanism converts energy into matter. We have clues, both experimental and theoretical, but a complete explanation is still lacking.

In the first place, we expect that energy can be transformed into matter because we believe that in the beginning there was only energy in the universe and, somehow, at some point in the far past, this energy gave birth to matter particles. Second, we know that it can happen because there is experimental evidence for a phenomenon called pair creation, where a photon with sufficient energy generates a positron and an electron. However, this is totally random and we cannot predict when and how it will happen.

There is a curious theoretical phenomenon called Unruh-Hawking radiation, sometimes treated separately as the Unruh effect and Hawking radiation, which is also related to matter creation. It is theoretical because we can deduce it from quantum mechanics and relativity, but the effect has not been observed experimentally so far. Hawking discovered that a black hole can induce the production of pairs of matter particles around its event horizon, and the emission spectrum of these particles is a black-body spectrum with temperature
\[T=\frac{\hbar g}{2\pi c k_B},\]
where g is the local gravitational acceleration. The equivalence principle of general relativity says that a gravitational field is equivalent to an acceleration, and this implies that an accelerated observer sees a background of matter particles where an observer at rest sees only the vacuum; these particles obey the same spectral distribution as the particles near the black hole, with temperature
\[T=\frac{\hbar a}{2\pi c k_B},\]
where a is the acceleration.
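
To see why the effect is so hard to observe, here is a quick numerical evaluation (my own, with standard values of the constants) of the Unruh temperature for an acceleration equal to Earth's surface gravity:

# Unruh temperature T = hbar * a / (2 * pi * c * k_B) for a = 9.8 m/s^2.
import math

hbar = 1.054571817e-34   # J s
c    = 2.99792458e8      # m / s
k_B  = 1.380649e-23      # J / K

a = 9.8                  # m / s^2
T = hbar * a / (2 * math.pi * c * k_B)
print("T = %.2e K" % T)  # about 4e-20 K, absurdly far below anything measurable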

The only explanation so far for how particles acquire mass comes from the Higgs mechanism, a kind of symmetry breaking involving a particle called the Higgs boson. But the Higgs boson has not yet been found experimentally, and there is another problem: we must assume that the Higgs has a mass itself, which only pushes the problem to another level: where does the mass of the Higgs come from? A self-interaction, you might say, but that is just a circular argument and does not help much.

The matter-energy problem has not been at the forefront of research in the last decades, but there is something very fundamental in it that must be understood if we want to go on with our aim of understanding how the universe works and how it appeared.

Tuesday 6 December 2005

Quantum Limitations

Quantum Foam

As I already said in another post, quantum mechanics is widely confirmed and one of the most successful theories about nature we (humans) have ever created. The agreement of its predictions with experiments is amazing and there are no known experiments that contradict the theory.

However, this is not the end of the story. QM is successful because its mathematics describes nature with tantalizing precision, but the math was tailored from experiments to fit them. This means that QM, unlike Relativity, is not derived from some fundamental principle. The lack of such a principle is what is behind the great number of alternative interpretations besides the orthodox one, which leads to strange situations like the Schroedinger Cat.

The lack of a first-principles derivation is also responsible for the existence of alternative theories that try to explain quantum phenomena, like Bohmian Mechanics, where David Bohm tries to explain quantum behavior through a mysterious quantum field that permeates spacetime, and Stochastic Electrodynamics, pioneered by Timothy Boyer, which uses classical mechanics plus a random background electromagnetic field and is able to reproduce a lot of good results. But no such theory has yet been proven to be exactly equivalent or superior to QM.

When I talk about the success of QM, I am not yet talking about quantum field theory (QFT). QFT arises from the merging of QM with special relativity. It has had a lot of success, but the way these results are extracted from the body of the theory is very tricky, and many scientists have the feeling that this should not be the final answer. Although, when supplied with some experimental measurements, QFT can give results that agree with experiments to one part in a billion (in QED, for example), the calculations come from infinite series expansions that do not converge. The expansions are truncated and a lot of work on renormalizing the theory (taking the infinities away) must be done.

The point is that, even though it gives the correct values for several quantities, QFT begins with a very questionable (in my view) procedure: you simply turn equations from classical physics into equations for fields, solve them by expanding in Fourier series and then impose quantum commutation relations between the fields and their canonical conjugate momenta. It is a recipe. We don't know exactly what we are doing, but we borrow the procedure of imposing these relations from plain QM and go on. Another thing: the conjugate momenta come from a Lagrangian density that is constructed in such a way that it gives the correct equations of motion, again without any fundamental principle. This procedure works, with some tricks, for Electrodynamics and for the Weak and Strong Forces (giving rise to electroweak theory and to QCD), but fails miserably for gravity.
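
For a single scalar field the recipe amounts to postulating, at equal times, the standard commutation relation between the field and its conjugate momentum (quoted here just to make the previous paragraph concrete):
\[[\hat{\phi}(\mathbf{x},t),\,\hat{\pi}(\mathbf{y},t)] = i\hbar\,\delta^3(\mathbf{x}-\mathbf{y}), \qquad \hat{\pi} = \frac{\partial\mathcal{L}}{\partial\dot{\phi}},\]
where the Lagrangian density enters exactly in the way criticized above: it is chosen so as to reproduce the classical equations of motion.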

In my view (and my view only, which means it is not the current view of the scientific community), without a clear understanding of what we are really doing we can't even be sure whether we should quantize gravity. Critics of string theory say that after so much time without success, maybe string theory is the wrong way, but the endeavour of quantizing gravity is much older and we have not managed it to this day. I'm not saying that quantum gravity is not worth pursuing, I'm just saying that maybe there is a tiny possibility that nature did not choose this path. However, today the probability that QG exists is probably higher than the probability that it does not. We have to wait for more theoretical results or experiments.

Just to cite one attempt at deriving QM from first principles, it is worth looking at the papers of Ariel Caticha, who is trying to show that QM can be obtained by applying principles of information theory and Bayesian inference to physics. The main theory is in Insufficient reason and entropy in quantum theory (quant-ph/9810074). He is also trying to show that general relativity can be obtained from the same principles: The Information Geometry of Space and Time (gr-qc/0508108).

Picture: Quantum Foam - taken from http://www.journal-kempten.de/

Sunday 27 November 2005

Avalanches


Avalanches are physical phenomena of great interest, mainly because they represent a big risk for those who live in or visit areas where this kind of natural disaster can occur. But avalanches are very complex. They arise from an instability in a pile of granular material like sand or snow. Granular materials can be piled up only as long as the slope of the sides of the pile stays below a critical angle. When the slope is above this angle, any extra grain added to the pile can cause a chain reaction and start an avalanche. The point is that you never know exactly when the avalanche will start.

Avalanches are an example of what is called emergent behavior. Complex systems, which are composed of a great number of interacting units, can show unexpected characteristics: strange organization phenomena and surprising effects.

Although the problem of modelling granular materials seems easy at first sight, it is a difficult matter and to this date we still do not have a unified theory. There are different approaches to attack this problem. One is to try to model a granular material as a kind of fluid with special properties: a hydrodynamical approach. The other is to build discrete toy models and analyze them mathematically.

The second approach is related to the famous Bak-Tang-Wiesenfeld sandpile model, which uses a two-dimensional cellular automaton to model a pile where at each time step a grain is added at a random site. When a site has a slope above a critical value relative to its neighbours, one or more grains are transferred to those neighbours. This model turns out to have a very special behavior called Self-Organized Criticality (SOC). This behavior is related to the distribution of the sizes of the avalanches in the pile and to the fact that the pile has a set of quiescent states, named metastable states, where the pile is momentarily stable.
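
To make the model concrete, here is a minimal sketch of a BTW-style sandpile written from memory (the height version with toppling threshold 4, not the exact rules of the original paper): grains are dropped at random, any site holding 4 or more grains topples and sends one grain to each neighbour, and the avalanche size is the number of topplings triggered by a single added grain.

# Minimal Bak-Tang-Wiesenfeld-style sandpile on an L x L grid.
import random

L = 20
grid = [[0] * L for _ in range(L)]

def add_grain():
    """Drop one grain at a random site, relax the pile, return the avalanche size."""
    i, j = random.randrange(L), random.randrange(L)
    grid[i][j] += 1
    topplings = 0
    unstable = [(i, j)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4                      # topple: 4 grains leave this site
        topplings += 1
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < L and 0 <= ny < L:  # grains falling off the edge are lost
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return topplings

sizes = [add_grain() for _ in range(20000)]
print("largest avalanche:", max(sizes))

After a transient the pile settles into the self-organized critical state, and a histogram of the recorded avalanche sizes follows (approximately) a power law.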

These models can be complicated or simplified as much as we want, and their study is not an easy matter, since they are models that must be studied out of equilibrium, and, once more, we don't have a unified theory for out-of-equilibrium phenomena either. An interesting example of a simplified model where you can see "avalanches" was sent to me last week by a friend named Marlo, who found it on the internet. It is a game with a two-dimensional cellular automaton where each site can be in one of four different states. You can change the state of one site and the interaction between states can trigger an avalanche effect. The aim is to trigger the biggest possible avalanche, although it is fun just to watch the dynamics and the metastable states to see what they look like.

My friend got excited about the game and told me that this could have a lot of consequences even in sociology... well, physicists have already thought of this and he is right. I'll edit this post another day and try to put up some links to show this.

Picture taken from: Milford Road.

Tuesday 15 November 2005

Kung Fu Science


I'm a kung fu fighter. After eight years and two knee surgeries, last year I finally got my black belt. My style is Ton Long, or Praying Mantis, one of the several kung fu styles that exist. Most of the styles are inspired by the movements of animals, like Tiger (Hung Garr), Crane, Eagle's Claws and Monkey, but there are others that do not follow the pattern, like Tai Chi Chuan, Drunk Style, Wing Chun (the style of Bruce Lee) or Suai Shiao. In fact, most of the styles are completely different martial arts, and kung fu is a common name for all Chinese martial arts. Kung fu is not even the correct name; its meaning is "hard work" and in China it is used for every kind of art that takes great effort to learn and master. The Chinese name for their martial arts is wushu or kuoshu.

My passion for kung fu is well known among my friends, and yesterday one of them sent me a link about the physics of kung fu, a site entitled Kung Fu Science. The site is about a study of the physics involved in breaking blocks with bare hands, led by a young PhD student of atmospheric physics. It has a beautiful presentation and design, and the text is very accessible for those who are not scientists. There are links to related studies about the physics of other martial arts at the end of the webpage. It is worth a visit.

Picture taken from: International Chinese Kung Fu Association Website.

Sunday 13 November 2005

Write an instantaneous paper!

This is a program written by three guys from MIT: Jeremy Stribling, Max Krohn and Dan Aguayo. It generates random papers that look real enough to fool someone who's not a scientist. The incredible part, though, is that these guys submitted some of these papers to real conferences and they were accepted!

I generated a paper with three friends of mine. It even has some references to other papers with our names on them (although we never wrote them...).

You can generate your papers and read details and the whole story in their site:
SCIgen - An Automatic CS Paper Generator.

Saturday 12 November 2005

Physics & Artificial Intelligence


Last Thursday I finally earned my Ph.D. in Physics. The title of my work, done together with my advisor Nestor Caticha, is "Learning on Hidden Markov Models". In it we studied the performance of learning algorithms for HMMs, a kind of machine learning model that is a special case of a wider class named graphical models. Machine learning is an alternative name for, although you can also consider it a particular area of, artificial intelligence. The difference between the two terms is as diffuse as you want, but technically the former is preferred.

It may seem strange that a physics thesis is about machine learning, but statistical physics is an area with a lot of interdisciplinary applications. It studies systems composed of a large number of individual interacting units. It has already given a lot of important results when applied to the study of perceptrons, simplified models of artificial neural networks, and our hope was that it could give interesting results for HMMs too. When I have prepared a suitable English version of my thesis and our papers have been submitted, I will put links to them here.

But coming back to the main point, what exactly does physics have to do with machine learning? Well, one of the first insights of machine learning appeared when two guys, McCulloch and Pitts, introduced a simplified mathematical model of a neuron. The model was inspired by the real neuron in the brain: it was composed of "synapses", through which the unit received inputs in the form of numerical values, and a "body", mathematically a function that processed the inputs, turning them into an output numerical value that was transmitted to other units by an output synapse. This model is in the paper "McCulloch, W. and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 7:115-133".

Linking a great number of these units together through their synapses, one can construct a network as complicated as we want. It can be shown that these networks can be used to store memories and to infer answers to questions. These artificial neural networks (ANNs) can have complex or simplified architectures, the simplest one being the so-called (Rosenblatt) perceptron. Well, our brain is a natural neural network with about 10^11 neurons. It's a huge number! So huge that for practical purposes we can treat it as infinite. This is where physics enters; to be more specific, statistical physics.
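
For readers who have never seen one of these units in action, here is a tiny sketch (a generic Rosenblatt perceptron learning the logical AND function, not any of the algorithms studied in the thesis):

# A Rosenblatt perceptron: a single McCulloch-Pitts-style unit whose
# weights are corrected whenever it misclassifies an example.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])                   # target: AND of the two inputs

w, b = np.zeros(2), 0.0
for epoch in range(20):
    for xi, target in zip(X, y):
        output = 1 if np.dot(w, xi) + b > 0 else 0
        error = target - output
        w += error * xi                      # perceptron learning rule
        b += error

print(w, b)                                  # a separating hyperplane for AND
print([1 if np.dot(w, xi) + b > 0 else 0 for xi in X])   # -> [0, 0, 0, 1]

The statistical physics calculations described in the next paragraphs ask a different kind of question: not what a single network learns from a single data set, but how the typical performance behaves when the number of such units goes to infinity.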

Statistical physics studies systems with a great number of interacting individual units, trying to predict the typical behavior of the system as a whole. It makes the connection between Newtonian mechanics and Thermodynamics. In Newtonian mechanics a system is described by the position and velocity of each particle, but in thermodynamics a system is described by "bulk" macroscopic properties such as temperature, pressure and volume. Thermodynamics is recovered from mechanics when we analyse the equations for a large number of particles and take averages; mathematically this limit, widely known as the Thermodynamic Limit, is attained only when the size of the system (the number of individual components) goes to infinity. This approach helped us understand interesting properties of matter, such as the famous Phase Transitions (like water boiling or ice melting).

It turns out that there is a way to map neural networks onto already-studied physical systems, such that when you apply the methods of statistical physics to them and take the thermodynamic limit, you can calculate their properties! This was done for perceptrons and worked pretty well! Later, other methods of statistical physics were used with the same success; one of the most celebrated, and most controversial for mathematicians, is a mathematical trick named the Replica Trick. But I've already written too much and will leave those matters for another post.

Picture taken from: http://www.nada.kth.se/~asa/Game/BigIdeas/ai.html

Wednesday 9 November 2005

Quantum Gravity: CDT

Causal Dynamical Triangulations (CDT) is a very recent approach to quantum gravity. Like LQG, it doesn't use any new principles or symmetries, but tries to quantize gravity using the Feynman path-integral approach to quantum mechanics.

The Feynman path integral in quantum mechanics is a formalism that allows us to calculate the probability of scattering processes in quantum field theory (QFT) in a simple way that can be associated with graphs for easier visualization and calculation. The idea is that there is a quantity named the propagator that can be calculated as a weighted sum over the possible ways a particle has of going from one point of spacetime to another. In classical physics some paths are forbidden, the ones where the particle would need to travel faster than light, but Feynman's idea was that in quantum mechanics all paths are allowed, and the resulting amplitude is calculated by the interference of all the paths. All the scattering probabilities can be calculated using the propagator, and this approach allowed the quantization of electromagnetic theory (QED). But gravity is a very tricky force and resisted the first attempts to quantize it in this way.
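
In formulas (the standard textbook expression, quoted here only for reference), the propagator to go from a point a to a point b is a sum over all paths x(t) connecting them, each weighted by its classical action S:
\[K(b,a) = \int \mathcal{D}[x(t)]\, e^{\,iS[x]/\hbar}.\]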

In gravity, the quantity analogous to a path in spacetime is a path in the space of all possible geometries of the universe. But this summation is divergent, i.e., the result is infinite, and a lot of work has been done to try to find a way to make this sum convergent. The CDT approach consists of approximating spacetime by a mesh of triangles (in this case, 4-dimensional analogues of triangles), carrying out the summation and then taking the limit where the size of the triangles goes to zero. In this continuum limit, the resulting theory is expected to be well behaved. The first results show that it could be.

CDT was developed by Renate Loll and Jan Ambjorn and has the advantage that a lot of simulations can be done, and some interesting results have appeared. One of the most interesting is the possibility that spacetime has different dimensionalities at different scales. This sounds a little strange, but technically what happens is that if you put a particle moving at random (technically, executing a random walk) in the simulated space, its behavior is similar to a particle walking in a 2-dimensional space for short times (interpreted as short scales) and similar to a particle walking in a 4-dimensional space for longer times. Mathematically, the particle is performing a diffusion in a fractal space, and there is a formula that gives the so-called spectral dimension of the diffusion. The spectral dimension can be calculated by fitting a curve to the plot of how the walk spreads versus the time spent and inserting the result in the formula. That is what they did, and these are the results they found.
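
The same diffusion trick can be illustrated in a completely ordinary setting. The sketch below (my own toy example, nothing to do with actual CDT simulations) runs random walks on a flat 2D lattice, measures the probability P(t) of being back at the origin after t steps, and uses P(t) ~ t^(-d_s/2) to estimate the spectral dimension, which here should come out close to 2:

# Estimate the spectral dimension of a flat 2D lattice from the return
# probability of a simple random walk.
import numpy as np

rng = np.random.default_rng(0)
walkers, steps = 200000, 100
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

pos = np.zeros((walkers, 2), dtype=int)
times, returns = [], []
for t in range(1, steps + 1):
    pos += moves[rng.integers(0, 4, size=walkers)]
    if t % 2 == 0 and t >= 10:               # returns are only possible at even t
        times.append(t)
        returns.append(np.mean(np.all(pos == 0, axis=1)))

slope = np.polyfit(np.log(times), np.log(returns), 1)[0]
print("spectral dimension ~", -2 * slope)    # close to 2 for a flat 2D lattice

In CDT the same kind of walk is run on the simulated quantum geometry, and the fitted dimension drifts from about 2 at short times to about 4 at long times.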

As I said, CDT is new and there are not many people working on it in the world, but if you are interested, search for it and for the papers published by Loll and Ambjorn on the arXiv. They always put a preprint of their work there. And take a look at the discussions in the "Strings, Branes and LQG" section of Physics Forums. You can always learn a lot there.

Sunday 6 November 2005

Redshift: the Quantum Explanation


Last year I attended a course on cosmology given by Prof. Raul Abramo at the Physics Institute of the University of São Paulo. At some point he was explaining Hubble's law and how we can calculate the Hubble constant by measuring the (cosmological) redshift of distant objects. The explanation for the redshift is that, as the universe expands, it stretches the electromagnetic waves so that their wavelength is increased. A larger wavelength means a lower frequency, and so the light is redshifted: the frequency of a light wave emitted by the object is shifted in the direction of red light and beyond (to the infrared and further…), since red light has a lower frequency than blue light.
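
In formulas (the standard definition, stated just for reference): if a(t) is the scale factor of the expanding universe, the emitted and observed wavelengths are related by
\[1+z=\frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}}=\frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{emit}})},\]
so any growth of a between emission and observation stretches the wavelength and lowers the frequency.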

After the class, I was thinking about the effect and realized that I had been given a classical (not quantum) explanation. The nature of light seems to be quantum, so a full quantum explanation should exist. I first asked a former professor of mine, Prof. Henrique Fleming, and he told me that there was no official explanation, because we don't have a Quantum Gravity theory yet, and the redshift is a relativistic, therefore gravitational, effect.

I thought about the question and came to the conclusion that the explanation would probably be given by an interaction between the photon and the graviton. Somehow, the photon should interact with the graviton and give it part of its energy. As a photon's energy is proportional to its frequency, less energy means a lower frequency and a redshift. Then, last week, I saw in Physics Forums a comment about a preprint by Michael Ivanov entitled "Low-energy quantum gravity" that presents the idea in detail. I have not read the paper yet, but it seems an interesting one. Although nobody is sure that the graviton really exists (and if you go to Physics Forums you will see a lot of people saying that it doesn't), maybe as a low-energy approximation the concept could explain the redshift effect in a quantum mechanical way, something that has not been done till now.

Picture: Barred Spiral Galaxy NGC 1300 - NASA

Wednesday 2 November 2005

Charge and Time


There are two big problems in physics that all the clues given by nature seem to indicate are related, but until now no one has been capable of explaining exactly what the relationship is, in such a way that both could be solved. They are the arrow of time and the baryon asymmetry.

To be brief (and very imprecise), the arrow of time problem is the problem of why time goes in only one direction, since all equations of physics are symmetric with respect to the time variable. There are a lot of conjectures, but no one is really certain about it. The baryon asymmetry comes from the fact that we observe much more matter than antimatter in our universe, which is a problem because (electric) charge is conserved, and if we suppose that in the beginning there was nothing, and that every time a matter particle is created its antiparticle is created with it, we should have as much matter as antimatter in our universe.

Both problems are related by a symmetry of nature named CPT, which says that if we simultaneously reverse time, the charges and the parity of a physical system, all the equations of motion stay the same. So charge and time are related somehow. As Feynman pointed out, we can consider, FORMALLY, that an antiparticle is a particle going backwards in time.

Now, to the crazy idea I've been thinking about. I stress this point: IT'S JUST AN IDEA, and I have to work on it to see if it has some chance of surviving or if it's just nonsense. Maybe the reason we see more matter is the same reason time only goes in one direction. Somehow, particles and antiparticles may have an internally defined direction of time, and as time goes forwards, we see much more matter. It seems to me like a symmetry breaking induced by a field or a fluctuation. Anyway, I need to work on it more...

Edit (04-Nov-2005): I was reading around and found that there are related ideas out there, so it's not so absurd after all.

Picture: A burst of light is emitted as the electron and its antiparticle, the positron, collide. (Image credit) NASA - Goddard Space Flight Center Scientific Visualization Studio

Sunday 30 October 2005

Selected Week News #3

The Man Who Would Murder Death
By Thomas Bartlett
From The Chronicle of Higher Education


The article is about Dr. Aubrey de Grey, a former computer scientist who turned to research on aging and has, to put it politely, a lot of unusual ideas about preventing death from old age. It sounds a little like science fiction, but biology is evolving very quickly now, and this century will bring as many revolutions as we saw in physics and information technology last century. It's interesting to be aware of it.



Wilma the Capacitor
By Paul Noel and Mary-Sue Haliburton
From Open Source Energy Network


This article suggests that hurricanes can act as natural particle accelerators and have capacitor-like properties that explain why they do not simply disperse. I remember learning that vortices are very stable fluid configurations per se, but a hurricane has a lot of interactions with the environment and I don't know how all these interactions affect its stability.



Walking Small: The First Bipedal Molecule
By Ker Than
From LiveScience


Almost every day I see some article about nanodevices. This is a very interesting one about a molecule whose shape lets it walk as if it were a bipedal bug. All these little machines will play an important role mainly in the medicine of this century, but also in a lot of other fields like materials engineering and chemistry. Just wait and see.

Wednesday 19 October 2005

Quantum Gravity: LQG


Loop quantum gravity, or LQG for short, is one of the most popular approaches to quantum gravity. It is in second place, right behind string theory. It is a theory that tries to quantize gravity using just plain quantum mechanics as we already know it, without incorporating any new principle. Compared to strings, a very humble theory.

The trick to make this approach work is to describe general relativity using a set of new variables, called Ashtekar variables, and impose the quantum commutation relations on these variables instead of on position and momentum as usual. These commutation relations are the mathematical relations that, in non-relativistic quantum mechanics, are responsible for the uncertainty principle, which says that if a particle has a well-defined position at some instant it has no well-defined momentum, and vice-versa. When you add relativity things become a little more complicated, but the spirit remains the same. Indeed, this method of imposing quantum commutation relations IS what we call quantizing a theory. Quantum mechanics is not a well-understood theory. It works, but we don't know exactly what happens. What we do know is that if we expand the solutions of the classical equations in Fourier series and impose the quantum commutation relations on some variables, it works. That is how QED was quantized, and it worked with astounding precision.
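
For reference, the non-relativistic relations referred to above are
\[[\hat{x},\hat{p}]=\hat{x}\hat{p}-\hat{p}\hat{x}=i\hbar, \qquad \Delta x\,\Delta p \geq \frac{\hbar}{2};\]
LQG postulates relations of the same kind, but for the Ashtekar variables instead of position and momentum.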

The approach is called LOOP quantum gravity because the variables to be quantized are known as Wilson loops. Formally, a Wilson loop is the trace (i.e., the sum of the diagonal components of a matrix) of the holonomy of a vector transported along some closed path (a loop) in spacetime. The holonomy is an operator that gives the resulting vector after the transport has been made. If you do this transport in a flat spacetime (with no gravitational fields), the resulting vector is the same as the initial vector. But if the spacetime is curved (like the surface of a sphere), the result is not the same vector.
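
A standard concrete example (a textbook fact, added here just as an illustration): parallel transporting a tangent vector around a closed loop on a sphere of radius R brings it back rotated by an angle equal to the solid angle enclosed by the loop,
\[\Delta\theta = \frac{A}{R^2},\]
where A is the enclosed area; in flat space the curvature vanishes and the vector comes back unchanged, which is exactly the distinction described above.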

Using these variables, physicists were able to find solutions to the resulting quantum equations for general relativity (GR). Some additional results have been found: for example, they found quanta of area and volume for spacetime (something that physicists liked, because many of us think that spacetime is fundamentally not continuous but discrete) and, like string theorists, the correct entropy formula for black holes (in special cases). LQG is far simpler than string theory, although this simplicity is relative, since the geometry involved in LQG is very sophisticated, and to date it has achieved as many successes as strings (theoretically, because experimentally they are on the same level: no confirmation at all). Some physicists even believe that strings and LQG can be combined into a single theory, because both have interesting physical insights and results and some similarities. As the popularity of strings comes down due to the lack of testable predictions, that of LQG is going up. The only thing that remains for us physicists is to keep working on the problem and see what Nature will reveal to us in the future.

Picture from the article: "Quantum gravity: The quantum of area?" - John Baez - Nature 421, 702-703 (13 February 2003). Original caption:

In loop quantum gravity, space is envisaged as a fabric of woven threads. Where these threads puncture a surface, such as the event horizon of a black hole, they define its area.

Sunday 16 October 2005

Selected Week News #2


Online Game Could Boost You into Space
By Leonard David
From Space.com

If you like to play, and if you're good, you can try to win this trip. The official page of the contest is www.space-shot.com


Hurricane Center Has One Name Left: Wilma
By Robert Roy Britt
From LiveScience


The interesting part of this article is the explanation of how hurricanes are named. I didn't know, since in my country we don't have hurricanes...


New “hobbit” bones bolster separate species claim
By Andy Coghlan
From New Scientist


It is one more species of man that appeared during evolution. Quite interesting, because it gives some reality to old legends. (And it shows that the scientists who named it liked "The Lord of the Rings".)


Duped and Clueless: How Easily We Fool Ourselves
By Ker Than
From LiveScience


This is a new experiment showing once more a fact that science has observed before: our memories are constructed by our brains and can differ from reality. The brain can even construct memories incorporating elements that are totally fictitious as if they were real. It's scary to know that what you remember might never have happened, isn't it?

The Rise of the Body Bots
By Erico Guizzo and Harry Goldstein
From IEEE Spectrum Online


A detailed article about the latest developments in the area of exoskeleton building. You can see that very soon they will probably be available for a variety of practical applications.

Friday 14 October 2005

Particle Physics Art


My brain works visually. That's why I prefer geometry to algebra (okay, I like both, but I like geometry more). And that's why I like to put beautiful (at least to me) pictures in the beginning of every post.

Sometimes in physics our theories reach phenomena that our eyes cannot. Then we need to rely only on math and have no clear visual picture of what is going on. But I suspect that a lot of other scientists also like visual input, and that is the reason Feynman diagrams and other diagrammatic methods became so popular in physics: they not only give us a better way to calculate things, they also give us some visual picture of the process, and we feel more comfortable with what we are doing. Somehow, we feel that we understand it better.

Particle physics is an area where pictures are always welcome, because we can only probe particles indirectly, by means of the tracks they leave in accelerator collisions. Rigorously, the math of quantum mechanics treats particles as structureless points. But although they're points and have no structure, they have a lot of associated properties like spin, momentum, polarization and mass, most of them going by the collective name of quantum numbers. The least we can say is that not having images in our brains to visualize those things is boring. Then yesterday I saw an article entitled "gallery: jan-henrik andersen", from which I took the picture at the top of this post (it represents a photon), about a designer who worked together with particle physicists to create graphical representations of the particles that reflect their properties. The article has a lot of beautiful images and you can download a PDF file, although the resolution of the pictures in the file is not as good as on the site. I really enjoyed his work.

Just one more observation: these are artistic REPRESENTATIONS of the particles; remember, as I already said, that our view of particles today (at least the elementary ones) is that they are just points.

Thursday 13 October 2005

Busted Archimedes




In the beginning I used to like Mythbusters, but as the episodes unfolded I started to be suspicious of their results, and then I finally realized that they do not test things with the required care. I stopped watching, because I saw that you cannot entirely trust their results and, sometimes, the results were so obviously flawed that it made me upset. One of those occasions was when they tested the story of Archimedes against the Roman fleet.

The story says that Archimedes devised a way to fight the Roman warships trying to invade Syracuse by focusing sunlight on them with soldiers' shields arranged in an ordered way. Mythbusters tested the story in a highly controversial way and concluded that it was false. Well, I didn't like the way they ran the test and commented to my fiancee that they were probably wrong, because the test was full of misconceptions and strange assumptions. Today I was browsing Slashdot and found that some guys from MIT did the test more carefully and indeed found evidence in favor of the story. At the very least, it cannot be totally discarded as the TV guys said.

Mythbusters have good intentions, but they must be more careful if they want more credit. My advice is to ALWAYS be suspicious about what they say: sometimes it's right, but sometimes it's not. I would give this advice to them too, but I guess they probably wouldn't listen to me...

(The picture is: Archimedes Death Ray. Wall painting, Florence, Italy)

Monday 10 October 2005

Quantum Gravity: Strings


String theory is the best-known approach to quantum gravity. It started as an attempt to describe the strong interaction in the 60's, but soon afterwards QCD was discovered and shown to be the correct approach. Strings were forgotten for some time and were revived by Green and Schwarz in the 80's. One of the main flaws of strings in the description of the strong force turned out to be what called attention to it as a possible theory of QG: the spectrum of the theory always contains a massless spin-2 boson, which was undesirable in the description of strong interactions. But if a graviton exists it should be exactly a massless spin-2 boson, and so the idea that strings could describe QG was born. And the graviton is not the only particle that appears in the spectrum of the theory; other particles with lower spin appear too, and so strings were believed to describe not only gravity but all the other interactions.

Okay, but what does the theory say? First, you must keep in mind that string theory is just a tentative theory, and not very successful yet. It appears to be consistent and appears to give general relativity and quantum mechanics in the correct limits, but this has not been rigorously proved yet. Worse, the theory cannot predict anything testable yet, although it can be tested in principle (if not, it would not be a scientific theory). Well, let me give an idea of the picture of the universe painted by strings. String theory has a simple underlying idea that changes everything. In ordinary quantum mechanics, elementary particles are considered points in space, i.e., they are 0-dimensional. In string theory they're supposed to be tiny strings, little 1-dimensional objects. Note that strings are not made of something else; in fact, they're the fundamental stuff that makes up everything.

Okay, you have 1-dimensional objects, but now you need to know how they move. As string theory must agree with relativity in the correct limit (the limit where the string looks like a point), the movement of the string is supposed to behave analogously to that of a particle in relativity. A particle in relativity traces a curve in spacetime that is a geodesic, i.e., the shortest path from one point to the other, with the distance given by a metric defined by the mass-energy distribution in space. But as the string is 1-dimensional, instead of a minimal path we require that the string move in such a way that it traces out a minimal area from one position to the other. This is the fundamental principle of strings. After defining this, you can do some calculations. But this description is not a quantum description yet, so you need to apply the mathematical rules that characterize quantum systems, and you get what people call Bosonic Strings, because this kind of string gives only bosons in the calculations.
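
The minimal-area principle is usually written as the Nambu-Goto action (the standard form, quoted here for reference rather than derived), which is simply proportional to the area A of the worldsheet swept out by the string,
\[S = -T\int dA,\]
where T is the string tension; extremizing S is the string analogue of the geodesic principle for the relativistic point particle.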

Well, the universe is not composed only of bosons, which are the particles responsible for the interactions (strong, weak, EM and gravity), but of fermions too. Fermions are the particles that make up matter: quarks and leptons (which include electrons). You include fermions in string theory by adding a kind of symmetry of Nature named supersymmetry. It is a principle that says that for every boson there exists a corresponding fermion. This hypothesis is a fundamental component of string theory, but it has not yet been confirmed in experiments. If it turns out to be wrong, string theory is probably wrong too.

The last curious feature of strings is the hypothesis that the universe has not 4 dimensions but more. This comes from a mathematical requirement: for string theory to be well defined, the number of dimensions of the universe must be a specific number, 10. The problem is that we only experience 4 dimensions in our daily lives, and string theorists had to adapt an old trick first developed by Theodor Kaluza and Oskar Klein to reduce the number of dimensions we perceive, a theory that appropriately bears the name of Kaluza-Klein Theory. The best way devised so far is a mathematical process where the extra dimensions are wrapped up in a geometrical construct named a Calabi-Yau manifold, which is represented in the picture at the beginning of this post. This hypothesis has not been tested either, because our technology cannot yet probe the distances necessary to do the tests, but soon it will be possible. If we cannot find these extra dimensions, again string theory will be wrong.

String theory is ambitious. The idea is to describe all interactions in a unified framework. This ambition has a drawback: the theory is exceedingly complicated, and to this date you cannot calculate anything numerically to compare with experiments. The predictions of extra dimensions and supersymmetry are important to strings, but even if they are found they will not be sufficient to prove the theory, for you can have other theories with these same ingredients. A recent result of strings was the calculation of the entropy of a special kind of black hole, with the correct factor given by the Bekenstein-Hawking formula. But as black holes are not experimental facts yet, it is just a marginal success.

Theoretical developments of strings have led in the past years to a myriad of new possibilities and mathematical techniques. It is now believed that strings are part of a much more complex (and to this date undefined) theory called M-Theory. Strings are no longer the only component of the theory; you also have objects of any dimensionality called branes. But as time passes and predictions and experimental observations do not happen, the scientific community is becoming more and more suspicious of the correctness of string theory. This has led to new theories of quantum gravity as alternatives to strings. Whether they are related to it somehow, nobody knows. In the end, the answer always lies with the only one who knows the correct laws: Nature.

Saturday 8 October 2005

Selected Week News #1


Mystery Ocean Glow Confirmed in Satellite Photos
By Robert Roy Britt
From LiveScience

For the first time, a strange glow in the ocean already described by sailors has been photographed by satellite. The only explanation thought of so far is that it should be luminescent bacteria (which emit light by a process called, a little obviously, bioluminescence), but more studies are needed to find the truth. It is interesting to think of how many natural phenomena we have yet to study on our own planet. I will not be surprised if more ghost and alien sightings turn out to be explained by natural phenomena we don't understand yet.


Python Eats Gator, Explodes
By Denise Kalette
From LiveScience


Poor snake! That's what happens when men break the natural balance by introducing alien species into different habitats. But in the end, I think the world is getting so connected that it will be very difficult to maintain isolated natural habitats. Sad, because the number of species will diminish drastically and some substances that would have been discovered among these organisms will probably take much longer to be synthesized by us.


Micro-organisms may be turned into nano-circuitry
By Will Knight
From New Scientist


The picture shows diatoms, a kind of microorganism with features on the scale of nanometers, which some scientists think could be used to construct circuits.

IgNobel 2005
The IgNobel Prize is back and this year's laureates have already been announced. If you don't know what it is, it's a prize given to the weirdest research and achievements of the year. Sometimes these guys are a little unfair, but they are surely always fun.

Thursday 6 October 2005

Reflections about the Bomb


Yesterday I saw a documentary about the Manhattan Project (MP) on the History Channel. I have always wondered what would happen if some government put together as many brilliant scientists, and spent as much money as on this project, on a similar project with an objective that would benefit mankind. Suppose you brought together the most brilliant minds in the world in the area of Medicine and gave them as many resources as the USA spent to build the atomic bomb in order to find the cure for AIDS. I bet they would find a solution in a couple of years or less!!!

I'm a physicist, so I'll talk about physicists. To work together with the scientists of the MP is the dream of every physicist: they were a group that could do almost anything! Fermi, Bethe, Bohr, Feynman, Oppenheimer and a lot of other great physicists were there. If the government had asked them to create a teleporting machine, they would have done it! They would have discovered anything that was not impossible, and if it was, they would have gotten as close as possible. But sadly this kind of project is only created to destroy, not to construct.

Most of the physicists at the MP didn't like it when the bomb was dropped on civilians, although some were people who had lost so much in the war that they were driven by hate and didn't realize the atrocity that was being committed (e.g., Edward Teller). The documentary said that the scientists of the MP signed a document asking for the bomb not to be used in the war, just tested in an isolated place. A little naïve, I admit, but sometimes we scientists are very naïve.

Well, I was researching a little about the project and the bomb and found this interesting letter from Einstein to President Roosevelt. This was the letter that initiated the MP. Einstein was afraid that the Nazis would build the bomb before any other nation and drop it, so he warned the USA about the danger. Einstein didn't work on the MP, but he was as naïve as all the others in thinking that the USA would never use the bomb, even after building it.

I would very much like to see, one day, projects like the MP to find the cure for AIDS, for cancer, for Alzheimer's, to build teleporting machines, spaceships and other things. I just fear that somehow governments would distort them and use them to make war and control other nations. Sad.

Tuesday 4 October 2005

Nobel of Physics 2005




The 2005 Nobel Prize in Physics was awarded today to two American scientists, John Hall and Roy Glauber, and to one German, Theodor Hänsch. They share the prize for their work on Quantum Optics.

Hall and Hänsch won the prize for work in quantum optics that advanced the field of spectroscopy and enhanced the precision of spectrometers. Spectroscopy is the set of techniques used to analyze the light emitted by atoms or molecules. Each atom or molecule absorbs light in a particular way and emits this energy back to the environment in a specific pattern that defines a signature. This is how astronomers can tell you the elements in the composition of a distant star: they analyze its light and identify the patterns of the different elements. These patterns also revealed that the electrons in atoms sit in specific orbits ordered by integer numbers, which Bohr explained in his model of the atom. But that is another story.

Glauber is a guy I only knew from a classic work in statistical physics where he introduced what is today called "Glauber Dynamics". It is a way to introduce and study dynamics in the Ising model (someday I'll talk more about it), which is originally a static model intended to describe magnetic materials. But he won the prize for working on quantum optics too, and it seems that he is in some sense the father of this area: he was the first to apply quantum mechanics to describe optics.

I found an interesting document explaining the contributions of these three physicists on Hänsch's homepage. I could hardly explain it better myself. I copied the document and put it HERE.

You can find more information on the Nobel Prize Page itself.

The image was taken from the site of the Vienna University of Technology where it has the caption:

The purple light originates from helium atoms excited by intense laser light. The laser pulses propagate along the axis of the purple lobes (horizontally) through the helium gas, and the X-ray beam (not visible) is radiated in a beam several hundred micrometers in diameter in the same direction. Photo by courtesy of: J. Seres, Vienna University of Technology.

Sunday 2 October 2005

A Sign for Half the Universe (?)


When I was leaving my room at the university last Monday, a friend of mine was uneasy about some problem, and I asked him if the problem was a minus sign I had heard him talking about with another friend. He said that the problem was deeper, although sometimes a minus sign can be a deep problem. I answered: "It was for Dirac, wasn't it?".

I was referring to the fact that Dirac turned an undesirable minus sign in the solution of an equation into a great success. Let me tell the whole story. Paul Adrien Maurice Dirac was a British physicist who was working on a relativistic quantum wave equation for the electron and found one whose solutions give two values for the energy of the electron, one positive and one negative. Usually the procedure would be to discard the negative energy solutions and keep going with the positive ones. But Dirac noted that he could interpret the negative energy solutions as a new particle that had almost all the same characteristics as the electron, but with the opposite charge. He called the particle the "positron" (for positive electron or positive particle or whatever…) and it was found experimentally some time later. He had just predicted antimatter!
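
The sign in question comes from the relativistic relation between energy and momentum (quoted here for reference),
\[E = \pm\sqrt{p^2c^2 + m^2c^4},\]
and it is the negative branch that Dirac reinterpreted instead of throwing it away.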

I had a professor called Henrique Fleming who used to say that Dirac discovered half of the universe. That is because matter and antimatter, apart from their opposite electric charges, seem to have the same characteristics, and there is no reason why the universe should have more matter than antimatter. Okay, there is a deeper reason for supposing the equality: in processes that create particles with mass from ones that have no mass, matter and antimatter are created in the same proportion, so that electric charge is conserved in the whole process. It is indeed a great mystery for physicists why matter dominates antimatter in the observed universe. There are some speculations, but just that.

Dirac discovered a lot of other things. He was one of the greatest physicists of the 20th century and you will probably hear more about him in future posts. Antimatter is another interesting subject and I'm planning to talk more about it later. So, as you can see, a little minus sign in an equation can make a big difference. I suggest you always double-check your calculations; who knows what secret may be lurking behind a "-".

Wednesday 28 September 2005

Stereograms & Lena


After posting that stereogram of a dinosaur, I took a look at Google to see if I could find other interesting pictures. I found an interesting site (okay, it was the first to appear in my search) named simply Stereogram Page, with some useful information and stereograms. Although I suggest looking at more pages if you become really interested in the subject, I found on that site an interesting picture (the one at the beginning of this post).

This picture is not interesting because of the beauty of the lady in the photo, but because that girl is Lena. I met Lena for the first time a couple of weeks ago in the room of another physicist. While he was telling me about some work he is doing on image processing, he showed me a photo of a girl (just the face) and asked me if I knew her. I said I didn't, and he told me who she was, along with a little story.

The photo was one of Lena, and he told me that the picture is extremely popular in the image processing community, but people didn't know who Lena was until someone decided to look for the first paper where the image appeared and asked the authors where they had found the photo. They said they had scanned it from an old Playboy magazine. These guys then tried to find her, and they succeeded: she was by then a sweet old grandmother. So they took her to a conference on image processing and she was warmly applauded. He told me the story and then showed me the original photo, full body. It's sad I don't have enough space to put it here... :o)

Sunday 25 September 2005

Quantum Gravity


There are four known fundamental interactions in our universe: strong, weak, electromagnetic and gravitational. When quantum mechanics was found to be the correct framework in which to formulate our physical theories, a program started to describe these four interactions in a quantum way. Successful quantum descriptions of the first three (strong, weak and EM) were found. EM was the simplest to quantize, since Maxwell wrote his equations in such a general way that they didn't need to be modified (they didn't even need to be modified to fit relativity! I'll write a post about this in the future, if I remember…). The resulting theory is known as QED, or Quantum Electrodynamics, and is thoroughly confirmed by experiments. The quantum theory of strong interactions is known as QCD, or Quantum Chromodynamics, and is also confirmed by experiments. The weak interaction is described by the Electroweak Theory, which has the additional bonus of unifying the weak and EM interactions in a single framework. Together, these three quantum theories form a theoretical framework called the Standard Model, which explains all the known microscopic physics that our present technology can access without being contradicted by any experiment.

This seems very nice, but there is a hole in all of this: gravity. In the microscopic world, gravity is so weak that we can ignore gravitational interactions and the results of experiments will not change in any perceptible way. But we have strong arguments supporting the idea that we must quantize gravity just as we did the other forces. We tried it, and we failed miserably. Technically, if we try to quantize gravity using the same techniques that succeeded for the other three forces, we discover that gravity is nonrenormalizable. This is a technical word meaning that in our calculations we find a lot of infinities that we cannot make disappear, and so we cannot calculate things with our theory and cannot make predictions. We are lost!
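
A quick way to see where the trouble comes from (a standard textbook dimensional argument, added by me as an aside) is that Newton's constant carries a negative mass dimension in natural units, so the effective strength of gravity grows with energy and the usual renormalization procedure breaks down:

```latex
G_{N} \sim \frac{1}{M_{\mathrm{Pl}}^{2}} \quad (\hbar = c = 1),
\qquad
\text{effective coupling at energy } E:\quad G_{N}E^{2} \sim \left(\frac{E}{M_{\mathrm{Pl}}}\right)^{2}
```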

Stay calm, not everything is lost. Our failure only means that gravity is more complex than we thought and we need to be a little smarter to find the correct theory. This is good, because we have a really tough challenge, and we physicists like challenges! Well, let's get back to gravity. Since our simplest attempt didn't work, we had to try other approaches. The first approach, and the one most widely known by the general public, which gave us some hope of finding a quantum gravity (QG) theory, was String Theory, but its complexity has reached enormous proportions and after decades it has not yet provided a correct, testable QG theory. Strings are an ambitious theory, because it not only tries to quantize gravity, it tries to unify all four forces of Nature in a single unified theory, or as people usually call it, a theory of everything (TOE).

But in the last two decades, a lot of theories alternative to strings have appeared. The most popular now is Loop Quantum Gravity (LQG). These theories are not as ambitious as strings, because they only try to quantize gravity, not to unify all the forces. Other approaches include causal sets, causal dynamical triangulations (CDT), twistors, spin foam models and others. Remember that these are TENTATIVE theories and none of them has been tested yet.

This is the big problem of QG in physics today. It is one of the most challenging problems in physics and a very active area of research. With this post, I'm starting a series of posts that will explain the existing approaches to QG in more detail. I hope you enjoy this journey. But be patient, it is long and tortuous. Prepare yourself, but remember that the most important thing is ALWAYS to have fun.

Saturday 24 September 2005

Prove me Wrong!


One of the most fundamental features of a truly scientific theory is that it must be falsifiable in principle. By falsifiability I mean the characteristic of a theory that enables it to be tested by means of experimentation. A theory that cannot be falsified by an experiment is just metaphysical or philosophical; it is beyond the scope of science. For example, if I say that people only die when their time comes, that cannot be disproved even in principle. There is no experiment or situation where this theory can be shown to be wrong: if someone dies, the time had arrived; if they don't, the time hasn't arrived.

When you find someone defending a theory by challenging you to prove that the theory is wrong, and saying that if the theory cannot be proved wrong it must be right, you can be sure that the person is not a scientist, or is a bad one. Theories that cannot be proved wrong are not good, they're useless. A theory that cannot be falsified has no predictive power, i.e., you cannot predict the outcome of new experiments; because if you could, the theory could be tested just by doing that particular experiment and checking whether the result matches the prediction. The more predictions of a theory are confirmed, the more the theory is trusted, but rigorously you can never say that a theory is completely correct. I'll post something later showing how this relates to Bayesian inference.
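
As a little teaser for that Bayesian post, here is a toy sketch (the likelihood numbers are completely made up, just for illustration) of how each confirmed prediction raises the probability you assign to a theory without ever pushing it to exactly one:

```python
# A toy Bayesian update: start from a modest prior that the theory is true and
# update it after each confirmed prediction. The probabilities are invented.
prior = 0.5                # initial degree of belief in the theory
p_confirm_if_true = 0.9    # chance a prediction is confirmed if the theory is true
p_confirm_if_false = 0.3   # chance it is confirmed anyway if the theory is false

belief = prior
for n in range(1, 11):
    # Bayes' rule after observing one more confirmed prediction
    numerator = p_confirm_if_true * belief
    evidence = numerator + p_confirm_if_false * (1.0 - belief)
    belief = numerator / evidence
    print(f"after {n:2d} confirmations: P(theory) = {belief:.4f}")
# The belief climbs toward 1 but never reaches it exactly.
```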

It seems too simple, but this simple requirement is the main reason why a god cannot enter science: gods cannot be falsified in principle. There is no experiment or situation that proves they don't exist. In the same way, any theory that includes gods is not scientific either (e.g., intelligent design).

So, add this to the items you must check to judge a theory: try to find a way to test it. If it cannot be tested in principle, if the theory cannot make a prediction but just fits the already known experiments, be very suspicious of it: it is probably not worth the time you will spend trying to understand it.

Thursday 22 September 2005

Why not to lie?


Do not expect a text about moral principles. And I will not say that lying is a sin, because I don't believe in "sins". Always telling the truth is not good because it will take you to heaven; there are a lot of rational reasons for doing it, and I'll try to enumerate some.

We are social beings. We interact with other individuals. Very few of us live in total isolation. Therefore, our life depends on our relationships with other people. Based on this, the first reason to always tell the truth is that people will always trust you. It's a great advantage! Someday someone will tell a lie that harms you, saying that you did something you didn't. If everybody knows you only speak the truth, you'll be safe; your word will be enough. But if you tell lies, then people may not trust you even if you are innocent. There is an additional bonus: most people will always tell you the truth too. People will feel compelled to pay back your favor of telling them the truth by doing the same. It's highly probable that this will happen. People will even tell you secrets, just because they know that if you promise so, you won't tell the secret to anybody else. And there is always someone who needs to share a secret with another person.

Another reason is that it will make you always try to do the right thing, and you'll probably benefit from this. That's because if you do something very wrong, you will have to lie to hide it. Therefore, you will try to do everything right. If you don't, you'll need to tell the truth, and something bad can come of this.

Now, consider what happens when you tell lies. People get angry with you if they find out. People will not trust you, and this can harm you. It's difficult to make friends if they don't trust you. What happens is that you will end up with no friends, and probably with a lot of enemies. Sometimes people gain from a lie, but on average the gain will bring more problems than good things. Unless you prefer a risky and lonely life, this is not the best thing to do.

Even the lightest lie can do damage. There is no little harmless lie. You're never doing any good for someone when you lie, although sometimes you may think so. Suppose people are worshipping some idol because they think the idol can do miracles. The community is happy because that renews their faith. You discover that the miracles are fake. If you don't tell them, in order to preserve their faith, you open the doors to someone who can use that faith for their own benefit. And trust me, there is always someone eager to do this.

Once I even had a talk with another physicist who told me that maybe you can derive moral rules from a rational model. Statistical physics is an area of physics that studies the behavior of interacting systems with a lot of components, exactly like any society. This is interesting, but I'd better talk about it in another post…

Picture: Pinocchio - Jim Salvati

Monday 19 September 2005

Beware of Revolutionary Theories!


I was reading some posts on Physics Forums and found something you see a lot on the Internet. Physics Forums has the advantage that a lot of serious physicists and mathematicians participate in the discussions and, sooner or later, things like that disappear, although sometimes people can be really insistent.

What I'm talking about is "Revolutionary New Theories": people saying that they have found a theory explaining all of science in a revolutionary way. Sadly, the scientists never believe in the theory and the author feels marginalized. Then he starts to write a web page so he can publish his "discoveries".

It is not too hard for an experienced scientist to identify this kind of false theory. I will not explain it in detail; I will just direct you to the sites below, where you can find almost everything you need to know about it:

Crackpot Index - by John Baez

Are you a quack? - by Warren Siegel

Read them and draw your own conclusions. What I intend to do in this post is just give two little, quick tips for identifying this kind of false stuff.

1. How many formulas can you find in the work? There is NO WAY to do science without math. Sorry, but that's true. If the work claims to be a professional work (for experts) and has almost no math, it is probably just a waste of time. Note that when I say "professional" I'm excluding books and articles that popularize science for non-professionals. I'm talking about works meant to be read and analyzed by scientists.

2. Take the name of the "theory" and of the author and try to find any related paper on the arXiv. Today, almost every physicist and mathematician posts a preprint of their work there. It is not a reviewed journal, but if the theory is not even there... something is wrong.

Remember that the above tips are just TWO little features you need to pay attention to. The sites I linked above have a lot more things you should check.

Be careful: the Internet is good for acquiring knowledge, but it carries wrong knowledge too.

Saturday 17 September 2005

Poor Evolution...



Evolution, like quantum mechanics and relativity, is one of the most misunderstood bodies of knowledge in science among those who don't study it. It is not just a scientific hypothesis. The evidence in its favor is overwhelming and there is NOT any scientific alternative to it. Evolution has made predictions that have been and continue to be verified day after day. It is genuine science, but people generally do not understand its content well, and this is the cause of lots of wrong press reports and statements about it. I will not talk here about the absurdity of those who believe that evolution is wrong and try to argue that intelligent design, which is not a scientific theory but just plain religion and mysticism masked as PSEUDOscience, is valid. That is a meaningless discussion for me. I'll talk about incorrect interpretations of the rules of evolution.

Sometimes you hear someone say that in the future all our fur and hair will disappear as a result of our evolution. This is wrong! A characteristic only changes in a species when there is some selection pressure to change it. There is no advantage in the modern world to being bald or having no fur. So what will probably happen is that the distribution of "furriness" in our species will tend to something like a Gaussian distribution, where a few individuals have a lot of fur, a few have almost none, and the majority have a moderate amount.

About our teeth: they're not disappearing as a result of evolution. Our jaws today are not large enough to accommodate all our teeth because we do not exercise them enough for them to develop fully. We eat "soft food" today. Everything we eat is processed and we simply do not allow our jaws to reach the size they would have if we ate, let us say, raw meat and raw vegetables. It is not evolution, it is convenience. There is no selection pressure in action that would force the number of our teeth to decrease.

On the other hand, our intelligence is improving. Okay, this may be true (note: MAY BE!). The argument is that the smartest people have a better chance to survive and procreate than the others. Although, we know that today the more developed the country, the lower the birth rate.

The key feature of evolution is natural selection. Changes only stick to a species if they give an advantage that allows the individuals carrying that feature to outcompete the others and reproduce more, or survive where the others cannot. No selection pressure, no evolution. This is, indeed, a prediction that can be tested, and every time it is tested, it confirms evolution.
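
To make the "no selection pressure, no evolution" point a bit more concrete, here is a toy simulation (invented numbers, nothing like a serious population-genetics model) comparing a neutral heritable trait with one under directional selection:

```python
import math
import random

def evolve(pop_size=500, generations=200, selection=0.0):
    """Return the mean trait value of a population after some generations.

    selection > 0 makes individuals with larger trait values more likely to be
    chosen as parents; selection = 0 means pure random drift.
    """
    traits = [random.gauss(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        # fitness weights: flat when selection == 0, tilted otherwise
        weights = [math.exp(selection * t) for t in traits]
        parents = random.choices(traits, weights=weights, k=pop_size)
        # offspring inherit the parent's trait plus a small random mutation
        traits = [t + random.gauss(0.0, 0.1) for t in parents]
    return sum(traits) / pop_size

print("mean trait without selection:", round(evolve(selection=0.0), 2))
print("mean trait with selection   :", round(evolve(selection=0.5), 2))
```

In the neutral run the mean just wanders a little around zero by random drift; in the selected run it climbs steadily in one direction, which is the whole point of "no selection pressure, no evolution".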