A Posteriori

Attempts to grapple with and elucidate empirical knowledge

New Home June 30, 2012

Filed under: Uncategorized — Rāhul @ 14:08

This blog is moving to my new home at http://www.pvrahul.com/blog/. I hope you will follow me there, although most of my posts there will be quite different from what I have here.


Olympic medals and their rankings March 3, 2010

Filed under: Uncategorized — Rāhul @ 01:11

As the Winter Olympics in Vancouver drew to a close last Sunday, the media went through the usual exercise of tabulating national medal tallies. Although the International Olympic Committee doesn’t recognise any of these arrangements, they make for interesting analysis. While some of the medal tallies are arranged in descending order of total medals won, others are ordered by the number of gold medals won, with silver and bronze medals used only to break ties. The latter scheme apparently enjoys wider appeal and is used in the Wikipedia medal tables too, but the total medal count is favoured by most American newspapers. Each scheme has advantages and disadvantages.

Ranking performances by total medal count obviously treats all medals the same and hence fails to credit a gold medal appropriately over a silver or bronze. This scheme also falls prey to a handicap of the awards system itself: the third-best performance in a discipline is awarded a medal alongside the first and second, while the fourth gets nothing. So a country with one 3rd-place performance is ranked above another with many 4th-place performances.

Ranking primarily by the number of gold medals won, with silver and bronze medals used progressively to split ties, does not completely remedy the injustice to 4th place, but at least the problem is less glaring: no number of third places can now make up for a second or first place. Yet it instinctively seems just as unfair to rank one gold medal above ten silver medals.

A fair system of assigning weights to gold, silver and bronze medals is required to compare them against each other in compiling the final medal table. Wikipedia lists schemes previously used by newspapers, such as 5:2:1, 3:2:1 and 4:2:1 as the ratios of gold, silver and bronze medal weights respectively. Perhaps a poll among past Olympians in a field could be used to set the right ratio for medals in that field. It seems to be a matter of opinion, but that shouldn’t stop us from exploring the possibilities!
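Computing such a weighted tally is a one-liner once the weights are fixed. Here is a minimal sketch in Python; the medal counts in it are made-up placeholders, not the official 2008 figures.

```python
# Weighted medal ranking: score = w_g*gold + w_s*silver + w_b*bronze.
# The tallies below are hypothetical placeholders, not official figures.

def weighted_score(medals, weights):
    """medals and weights are (gold, silver, bronze) tuples."""
    return sum(m * w for m, w in zip(medals, weights))

tally = {
    "Country A": (10, 5, 3),
    "Country B": (8, 9, 6),
    "Country C": (9, 7, 2),
}

# Gold-first ordering is just a lexicographic sort on (gold, silver, bronze).
gold_first = sorted(tally, key=lambda c: tally[c], reverse=True)
print("gold first:", gold_first)

for weights in [(5, 2, 1), (3, 2, 1), (4, 2, 1)]:
    ranking = sorted(tally, key=lambda c: weighted_score(tally[c], weights),
                     reverse=True)
    print(weights, ranking)
```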

Applying these schemes to some of the top nations in the 2008 Olympics, we have the following rank list:

Rank | Gold first | Total | 5:2:1 | 3:2:1 | 4:2:1
1 | China | United States | China | China | China
2 | United States | China | United States | United States | United States
3 | Russia | Russia | Russia | Russia | Russia
4 | Great Britain | Great Britain | Great Britain | Great Britain | Great Britain
5 | Germany | Australia | Australia | Australia | Australia
6 | Australia | Germany | Germany | Germany | Germany
7 | South Korea | France | South Korea | France | South Korea
8 | France | South Korea | France | South Korea | France

The three weighted schemes are obviously attempts at a compromise between the colour-blind total-medal list and the gold-first list. Considering this table, it isn’t surprising that the American media insisted on using total medals as their ranking criterion!


Science: Not just a vehicle for technology January 10, 2010

Filed under: General Physics,Public Policy — Rāhul @ 00:14

Modern life is linked inextricably to science and technology. These two words, although very different in origin and meaning, are so intertwined today that their history makes interesting reading. The Merriam-Webster online dictionary defines technology as the practical application of knowledge, and science as a system of knowledge covering general truths or laws capable of making falsifiable predictions. This suggests that science creates knowledge, which then propagates into technology that enriches our lives. But the relationship between the two is not always so clearly unidirectional.

2009 was celebrated worldwide as the International Year of Astronomy because it was the 400th anniversary of the great Italian physicist Galileo Galilei setting about building his “spyglass”, which he soon improved enough to discover the satellites of Jupiter, sunspots and the phases of Venus. Galileo did all this without a scientific understanding of the propagation of light, which had to wait for Isaac Newton. Instead, the technology of the telescope advanced by continued experimentation, enough for Galileo and others to gather sufficient empirical evidence for a then fledgling theory: that the earth is not the centre of the universe.

Against the popular belief of the time, Nicolaus Copernicus in the 16th century had proposed that it is the earth that revolves around the sun and not vice versa. However, it wasn’t until the preponderance of astronomical evidence gathered by telescopic observations a century later that Copernicus could be proved right. The technology of the telescope completed the most important step in the elevation of Copernicus’ proposal to the level of scientific theory: an overwhelming amount of data was gathered, all of which supported his idea over the model placing the earth at the centre of the universe, convincing all rational sceptics of its merits. In this case, the advance of technology allowed us to see farther and deeper into nature’s mysteries, revealing scientific facts.

On the other hand, modern technologies are inextricably linked to scientific advances. Quantum theory led to our understanding of semiconductor electronics, without which the computer industry wouldn’t have taken off. General relativity allowed us to understand gravity well enough to build spacecraft that put a human being on the moon. Enhanced understanding of the human body and the germ theory of disease led to cures for many infectious diseases. Science and technology, it seems, advance together, feeding off each other’s achievements like a system with positive feedback. Improved technology allows us to probe further into phenomena that perplex us, leading to scientific theories that help design still better technologies that add value to life. But this relationship between science and technology was not always so close.

Technological advances have been a hallmark of human civilisation throughout history. Our ancestors controlled fire, learned agriculture, invented the wheel and used natural medicines, all through empirical studies that established their utility without any real understanding of the underlying principles. Despite this handicap, technology made tremendous advances, the importance of which is underscored by the fact that historians use technological strides to define the ages of human history.

Modern science also had a precursor in history. All ancient civilisations developed natural philosophy to explain the mysteries that surrounded them. While technologies added comfort to life, philosophical inquiry addressed the relentless questions of the mind. But these endeavours did not mesh together to feed off each other’s advances the way modern science and technology do. For instance, some schools of Indian philosophy postulated an atomic theory of matter long before it became a scientific theory grounded in empirical evidence; the former cannot be called science because it was not based on experiments. The ancient Chinese, on the other hand, made practical use of the observation that magnets always tend to align along the same direction, but did not attempt to explain it from fundamental principles as we do now.

It was only post-Renaissance that modern science, as defined by the scientific method, was born. Natural philosophy began to be buttressed by structured, falsifiable experiments. Technologies increasingly made use of scientific advances and contributed to them too. This co-mingled development has led to a situation where we cannot imagine excelling in the pursuit of either without also excelling in the other. But what are the consequences of this blurring of differences?

It is easy to see a causal relationship between technology and tangible benefits to society. In a capitalist economy, technological advances can be readily commercialised and the inventors rewarded handsomely, so there is tremendous societal interest in incubating and facilitating technological endeavours. Science, on the contrary, is more of a personal pursuit. Although it leads to technologies, its major purpose is to satisfy our innate curiosity and thirst for knowledge. While this is as important as material progress, if not more so, it is difficult to make the case for a result-oriented society to support science for its own sake, purely for the joy of exploration.

Hence, scientists in modern times have tended to invoke the interlinkages between science and technology, and the way advances in the former translate into technological marvels, in attempts to win more societal support for science. While there is nothing wrong with the reasoning, and it has been successful in increasing science funding, we have to ask whether this is the right approach in the long run. By restricting the utility of science to the narrow channel of technological progress, we risk de-legitimising, in the eyes of society at large, the science that searches for answers to our basic questions.

Space exploration provides one of the best examples of this malaise. Although human beings have always yearned to unlock the mysteries beyond our earth and to go beyond the frontiers of generations past, we have now got used to justifying space missions by their perceived military or medical value. This has affected policy to such an extent that we choose space stations that add little to our understanding or sense of our place in the universe over grander missions into outer space.

When the pursuit of science is justified in terms of technological dividends, it advances the cause of neither science nor technology. The greatest contributions to technological progress have come from science done for its own sake. Taking the long view, appreciating the historical differences between the two and the different purposes they serve in enriching human life, can help us put today’s connections between them in perspective. The pursuit of technology and material progress is a choice. But scientific temper and understanding provide succour to the soul and are a necessity. We should be vigilant not to make support for the latter contingent on our desire for the former.


Graphene ICs around the corner? January 3, 2010

Filed under: Electronics,Semiconductors — Rāhul @ 13:51

2009 marked the 50th anniversary of the physicist Richard Feynman’s 1959 speech to the American Physical Society in which he foresaw the coming age of nanotechnology. Much of his vision of atomic-level fabrication and nanoscale surgical robots in the bloodstream is yet to be realised. However, the electronics industry has surged relentlessly down this path, following the now famous Moore’s law, which says that the number of transistors, the electronic building blocks on a chip, roughly doubles every two years.

Ever since 1965, when Gordon Moore, the co-founder of Intel, foresaw this exponential shrinking of individual transistors on a computer chip, the unbridled advance of electronics has given us progressively more powerful and compact computers. But as applications and user aspirations demand ever more powerful computers, Moore’s law is finally coming up against the brick wall of quantum physics. Transistors on today’s chips are only of the order of a millionth of an inch across, nearing the atomic scale where large-scale physical laws break down and the uncertainties of quantum mechanics take over.
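To get a feel for the numbers, here is a quick back-of-the-envelope sketch of the doubling law; the 1971 baseline of 2,300 transistors (the Intel 4004) is the often-quoted starting point, and the projection is illustrative arithmetic, not a fit to real chip data.

```python
# Moore's law as naive arithmetic: the count doubles every two years.
# Baseline: ~2,300 transistors on the 1971 Intel 4004 (illustrative).

def transistor_count(year, base_year=1971, base_count=2300):
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001, 2009):
    print(year, f"{transistor_count(year):,.0f}")
# By 2009 this gives roughly a billion transistors per chip.
```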

Scientists are now looking at ways to redesign the transistor to take advantage of the quantum laws. All such approaches mean a shift away from silicon, which is used near-universally today. One promising new material is graphene, a one-atom-thick sheet of graphite, one of the pure forms of elemental carbon.

Graphene was first isolated only in 2004, by peeling a layer off graphite using scotch tape. Since then, ways have been found to grow graphene on silicon carbide and to deposit it from solution. The thermal and electronic properties of graphene have been found ideal for transistor operation: current carriers in graphene travel very fast while picking up very little noise, and its very high thermal conductivity makes it attractive for densely packed integrated circuits, which need to dissipate heat efficiently.

The integrated circuits found on computer chips are essentially a clever combination of millions of transistors and other electronic elements in a logical circuit whose output depends on its inputs. Individual transistors, connected to form electronic switches, lie at the heart of the circuit. Just as mechanical switches can be flipped on and off to control the flow of current through a circuit, a transistor has a channel through which the flow of current is regulated by the voltage at a terminal called the gate. When the gate voltage is flipped between high and low, the transistor channel flips on and off. By cascading many thousands of such switches, or gates, complicated logical operations can be performed electronically.
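As a toy illustration of that cascading idea (generic switch logic, nothing specific to graphene or to the paper discussed below), each transistor can be modelled as a boolean switch, a few switches make a NAND gate, and NANDs alone suffice to build any logical function:

```python
# Model a transistor as a boolean switch and build logic by cascading.

def nand(a: bool, b: bool) -> bool:
    # Two switches in series to ground: the output is pulled low only
    # when both are on; otherwise it stays high.
    return not (a and b)

def xor(a: bool, b: bool) -> bool:
    # NAND is universal: four of them cascade into an XOR.
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(xor(a, b)))
```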

Although single graphene transistors had already been demonstrated, it was only recently that a graphene gate was shown. Floriano Traversi and Roman Sordan from the Politecnico di Milano, and Valeria Russo from the Department of Energy, Micro and Nano Structured Materials Laboratory, both in Italy, reported in Applied Physics Letters their demonstration of a graphene inverter. An inverter is the most basic logic gate: it gives a low output if the input is high and vice versa, essentially inverting the input at the output.

The researchers deposited two adjacent graphene layers on a silicon substrate covered with oxide, to form the channels of two transistors. Metal contacts were then formed behind the silicon substrate to act as the controlling gate, and between the graphene layers so that they were connected end to end. In an ingenious step, they then electrically annealed just one of the graphene channels so that the gate voltage at which it flips from off to on changed. If a constant voltage is now applied across the two transistors in series, the output voltage at the terminal they share depends on which transistor is off and which is on. By controlling the gate voltage, the scientists thus controlled the channel conductivities and hence the output voltage.
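A crude way to see why this works (my own sketch of the principle, not a model from the paper) is to treat the two channels as a voltage divider: the shared node sits high or low depending on which channel the gate voltage makes more conductive.

```python
# Toy voltage-divider model of a complementary inverter. The resistances
# are illustrative placeholders, not values from the paper.

def output_voltage(v_supply, r_top, r_bottom):
    # Voltage at the node between two series resistances.
    return v_supply * r_bottom / (r_top + r_bottom)

# One gate voltage turns the top channel on (low R) while, thanks to the
# annealing-shifted threshold, the bottom channel stays off (high R):
print(output_voltage(1.0, r_top=1e3, r_bottom=1e6))   # ~1.0 V  -> output high

# Flipping the gate voltage reverses the two conductivities:
print(output_voltage(1.0, r_top=1e6, r_bottom=1e3))   # ~0.001 V -> output low
```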

By tying many such switches together in circuits similar to today’s silicon integrated circuits, much faster computer chips with less noise could be built. But this technology is still at a nascent stage. The output voltage of the demonstrated gate does not switch between values that can be fed directly to another gate’s input. Also, unlike today’s silicon switches, this graphene gate cannot be turned fully off to shut off the current completely, so it dissipates power even while not switching.

Silicon electronics has matured over half a century of continuous improvement. The intense research interest and rapid progress in graphene-based devices, and this demonstration of the feasibility of graphene integrated circuits, indicate that post-silicon electronics might be just around the corner. Moore’s law will probably continue to hold into the near future.


Trailblazing December 20, 2009

Filed under: Uncategorized — Rāhul @ 20:39

The Royal Society of the UK is celebrating its 350th anniversary this year. To mark the occasion, it has made freely available on the internet some of the trailblazing leaps in science recorded in its proceedings over the years: from Isaac Newton’s theory of light and colours in 1672, to Benjamin Franklin flying a kite in an electric storm in 1752, Bayes’ essay on chance in 1763, Maxwell’s theory of the electromagnetic field in 1865, Dirac’s theory of the electron in 1928 and Watson and Crick’s DNA structure in 1954.

It was interesting to me that the only example from the 21st century they chose to highlight was a paper on geoengineering. I wonder if this amounts to an endorsement of research into the field by the Royal Society. With the current international impasse on emissions reductions, it is very likely that geoengineering will become increasingly prominent in the near future.

Partly due to my background in electrical engineering, my favourite paper among all the highlights is Maxwell’s masterpiece tying electricity and magnetism into the unified electromagnetic theory. Although the paper itself makes difficult reading today, even a cursory look betrays the rigour of the analysis and the genius of Maxwell. It is perhaps still the most elegant unification in the history of physics of forces and fields previously thought separate. For the sake of reductionist physics and the intellectual clarity that goes with it, I hope it doesn’t remain so much longer!
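For the curious, the theory is usually quoted today in the compact vector notation due largely to Oliver Heaviside, not in the form of the twenty component equations of Maxwell’s original paper:

\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
\]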


All the room at the bottom December 8, 2009

Filed under: General Physics — Rāhul @ 18:42

In December 1959, 50 years ago this month, Richard Feynman gave a talk to the American Physical Society at Caltech. Titled “There’s Plenty of Room at the Bottom”, it laid out the promise of the as-yet-unborn field we today call nanotechnology, and challenged physicists to turn their attention to the consequences of the laws of physics at small scales. The potential of nanotechnology is widely recognised today, and significant effort and funding are directed at it. On this 50th anniversary, I would like to review briefly the original talk by Professor Feynman and explore how it has shaped nanotechnology research.

Feynman starts the talk by appreciating the unique journey of an experimentalist who makes the first inroads into a hitherto unreachable field, as Kamerlingh Onnes did in low-temperature physics, and proposes as a similar frontier the “problem of manipulating and controlling things on a small scale”. He then lays out the striking challenge of writing all 24 volumes of the Encyclopaedia Britannica on the head of a pin by reducing its print linearly by a factor of 25,000, and, in the same vein, of carrying all the information in the world’s great libraries in a small block. He goes on to suggest using codes of a few atoms, instead of letters and symbols, to compress information to even smaller dimensions, illustrating just how much “room” there is at the bottom. The central technological advance Feynman anticipated would drive all this was a better electron microscope. In 1959, electron microscopes could resolve dimensions as small as 1 nm; he challenged physicists to reduce this to 10 pm, a hundredfold improvement, which would let us see and manipulate individual atoms.
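The pinhead claim is easy to sanity-check with a little arithmetic. Here is a rough sketch; the page count and page dimensions are my own assumptions, not Feynman’s figures.

```python
# Sanity check: does the Britannica fit on a pinhead shrunk 25,000x?
import math

pinhead_diameter_m = 1.5875e-3     # a 1/16 inch pinhead, as in the talk
scale = 25_000                     # linear reduction factor

pinhead_area = math.pi * (pinhead_diameter_m / 2) ** 2
scaled_up_area = pinhead_area * scale ** 2   # area scales as the square

# Assumed: ~24,000 printed pages of roughly 25 cm x 30 cm each.
britannica_area = 24_000 * 0.25 * 0.30

print(f"pinhead, scaled up: {scaled_up_area:,.0f} m^2")   # ~1,200 m^2
print(f"Britannica pages:   {britannica_area:,.0f} m^2")  # ~1,800 m^2
# Same order of magnitude, so the factor of 25,000 is plausible.
```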

Throughout the lecture, Feynman described only possibilities that followed the laws of physics as then understood but were beyond the reach of technology. He focussed on the effects of miniaturisation on computers. In the 1950s, computers with relatively few circuits filled entire rooms; if all the devices and circuits were made at the atomic level, he suggested, we could have computers with far more complicated circuits in a much smaller space, which is exactly what we have today. He then discussed how the problems of lubrication and heat dissipation would scale favourably at small dimensions. He also raised the possibility of nanorobots entering the bloodstream to conduct surgery, an idea that has since received considerable play in science fiction. Addressing the problem of assembly at the nano level, he suggested using a cascade of master-slave connections, either mechanical or electrical, that would assemble at progressively smaller scales, and identified the need to improve the precision of the apparatus at each stage. As the final frontier, he considered the problem of re-arranging atoms themselves so as to synthesise virtually anything, from elements and compounds to minerals. He ended by noting how different the physical laws are at such small scales and announcing prizes for a technology challenge in this direction.

Although his groundbreaking work in quantum electrodynamics was well behind him, Professor Feynman did not then enjoy the public reputation of the supremely brilliant, erudite, yet witty and charming scientist that he does today. So it is interesting that so many papers in nanotechnology cite this lecture as the beginning of the field; there is no direct link between the talk and the various advances that came later. But in many ways Feynman was prophetic. The electron microscope can today resolve down to 50 pm, which is as good as a biologist needs. Computers have indeed packed more and more circuits, devices and memory into smaller areas and grown powerful and complicated. But his vision of nano-level assembly and surgery seems no closer today than when he talked about it. In a series of articles this month, Nature Nanotechnology describes how a nascent field looked to this lecture as a focal point that drove the enormous advances of the last few decades. While Feynman got a lot right through his crystal ball, some of his predictions aren’t right yet!

Throughout the talk, the reader (and, I am sure, the listener!) can sense the scientific zeitgeist of the 1950s: a reductionist viewpoint in which everything could in principle be analysed by a set of physical laws. Chemistry, biology and other studies, it was thought, could eventually be reduced to physics, and once we had all the fundamental physical laws we could build everything else from them. Although this point of view still holds much water and retains a romantic sway, it is undeniable that the major advances of the last few decades have been in biology, psychology and neuroscience, and even many physicists today take an emergent, rather than reductionist, view of the science. It can be argued that this signifies a failure of the vision and intellectual firepower required to make fundamental advances; perhaps, with some momentous discovery, we will return to the reductionist viewpoint. But for now, science continues to look where the light is for the needle lost in the dark, and tries to push the frontiers of the lighted area ever outwards. Maybe it is the ability to manipulate things on the atomic scale that will eventually lead us to the next great leap forward!


Heating and Car Mileage November 17, 2009

Filed under: General Physics — Rāhul @ 23:51

We are all aware of the mileage hit we unwillingly take when we decide to run the AC in the car on a hot day. But what about the reverse? What effect does running the heater have on petrol mileage? A friend recently brought up this topic, and it made for some interesting discussion.

I don’t usually use the heating in my car in winter because I am usually dressed warmly and my car rides are pretty short anyway. So I haven’t had the opportunity to test the hypothesis below by holding other conditions steady and seeing whether I get more or fewer miles from a full tank of petrol with the heating on. But I do manage pretty high mileage for my ’93 Corolla (nearly 30 miles to the gallon in cold weather, with about 50% highway miles), which could be related to my heating preference.

In my analysis below, the key fact to remember is that the mileage a car achieves is a function of the ambient temperature. At normal ambient temperatures, a controlled quantity of fuel is injected into the cylinder during each cycle and ignited to produce power. But when the temperature is very low, the car lets in more fuel each cycle, compensating for the lower temperature with a higher fuel-to-air ratio. As the temperature decreases, the mixture in the chamber needs to be richer in fuel to burn at the same rate (in terms of power produced per cycle) as at a higher temperature. So when the ambient temperature goes down, fuel efficiency goes down too, which is why we see a large drop in petrol mileage in cold weather.

Fuel efficiency also decreases when the temperature is very high, but this has mostly to do with the enhanced cooling needs of the engine and the passengers rather than with the internal combustion engine itself. Cooling requires energy, which eats into fuel efficiency. So there is a range of optimum ambient temperatures at which petrol mileage is highest, on either side of which it tapers off, a feature of many systems when it comes to efficiency.

Coming back to the original question, the way to gauge the effect of heating on mileage is to see how it affects the temperature of the engine. Car heating systems usually work by siphoning off part of the heat generated in the engine to warm the passenger cabin. In winter, especially right after the car is started, the engine temperature is well below the optimum range; if some of the heat being generated is diverted to the cabin at this time, it will reduce fuel mileage. But if the car has been running for a while and the engine is already above the optimum range (unlikely in harsh winters and short drives), then fuel mileage will increase if the engine is cooled by letting some of its heat out. Of course, when the heat is on, a fan channels the heat as required, and this fan is a drain on the engine whatever the temperature. At sufficiently high temperatures, the mileage gain from shedding engine heat exceeds the mileage loss from the fan. But then I am not sure why I would use the heater on such a hot day!
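To make the argument concrete, here is a toy model of the hypothesis; every number in it (the optimal temperature, the penalties, the heater’s cooling effect) is an assumption for illustration, not measured data.

```python
# Toy model: mileage peaks in an optimal engine-temperature band, the
# heater fan is a small constant drain, and the heater sheds engine heat.
# All constants below are illustrative assumptions.

def mileage_mpg(engine_temp_c, heater_on=False):
    optimal_c = 90.0        # assumed optimal engine temperature
    peak_mpg = 30.0         # assumed peak mileage
    if heater_on:
        engine_temp_c -= 10.0   # heat diverted to the cabin cools the engine
        peak_mpg -= 0.2         # constant fan load
    # Efficiency falls off on either side of the optimum.
    return peak_mpg - 0.05 * abs(engine_temp_c - optimal_c)

# Cold engine: the heater pulls it further from optimum, mileage drops.
print(mileage_mpg(60.0), mileage_mpg(60.0, heater_on=True))    # 28.5 vs 27.8

# Overheated engine: shedding heat outweighs the fan penalty.
print(mileage_mpg(110.0), mileage_mpg(110.0, heater_on=True))  # 29.0 vs 29.3
```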

This post strays from the stated purpose of the blog: rather than forming scientific conclusions from observed facts, it uses a priori knowledge to predict what will happen. I hope to gather data and make an a posteriori post on this matter sometime soon. Till then, my hypothesis is that using the heater in winter does reduce fuel mileage, although the effect is likely much smaller than that of the A/C in summer.
