A formal definition of religion is notoriously difficult to formulate, but it must surely involve reference to a particular way of life, practices oriented toward a conception of how one should live. “You must change your life,” as the broken statue of the god Apollo seems to say in Rilke’s poem. Science does not—it isn’t designed to—recommend approaches to what Emerson calls “the conduct of life.” Nevertheless, Richard Dawkins claims that religion “is a scientific theory,” “a competing explanation for facts about the universe and life.” This is—if you’ll forgive my theological jargon—bullshit.
Science and religion ask different questions about different things. Where religion addresses ontology, science is concerned with ontic description. Indeed, it is what Orthodox theologian David Bentley Hart calls their “austere abdication of metaphysical pretensions” that enables the sciences to do their work. So when, for instance, evolutionary biologist Jerry Coyne and pop-cosmologist Lawrence Krauss dismiss the (metaphysical) problem of how something could emerge from nothing by pointing to the Big Bang or quantum fluctuation, it is difficult to be kind: Quantum fluctuations, the uncertainty principle, the laws of quantum physics themselves—these are something. Nothing is not quantum anything. It is nothing. Nonbeing. This, not empty space, is what “nothing” signifies for Plato and Aquinas and Heidegger, no matter what Krauss believes. No particles, no fluctuation, no laws, no principles, no potentialities, no states, no space, no time. No thing at all.
Atheists: The Origin of the Species seems to have been born out of frustration with these and other confusions perpetuated by the so-called “New Atheists” and their allies, who can’t be bothered to familiarize themselves with the traditions they traduce. Several thoughtful writers have already laid bare the slapdash know-nothingism of today’s modish atheism, but Spencer’s not beating a dead horse—he’s beating a live one, in the hope that Nietzsche might rush to embrace it. Several critics have noted that if evangelical atheists (as the philosopher John Gray calls them) are ignorant of religion, as they usually are, then they aren’t truly atheists. “The knowledge of contraries is one and the same,” as Aristotle said. If your idea of God is not one that most theistic traditions would recognize, you’re not talking about God (at most, the New Atheists’ arguments are relevant to the low-hanging god of fundamentalism and deism). But even more damning is that such atheists appear ignorant of atheism as well.
Maguire and co begin with a couple of thought experiments that demonstrate the nature of integrated information in Tononi’s theory. They start by imagining the process of identifying chocolate by its smell. For a human, the conscious experience of smelling chocolate is unified with everything else that a person has smelled (or indeed seen, touched, heard and so on).
This is entirely different from the process of automatically identifying chocolate using an electronic nose, which measures many different smells and senses chocolate when it picks out the ones that match some predefined signature.
A key point here is that it would be straightforward to access the memory in an electronic nose and edit the information about its chocolate experience. You could delete this with the press of a button.
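The electronic-nose side of the thought experiment can be sketched in a few lines. This is my own illustrative toy, not code from Maguire and co’s paper: the class name, sensor values, and the “chocolate” signature are all invented for the example. The point it demonstrates is structural: the device’s entire “experience” of chocolate is one local, addressable record, so deleting it really is the press of a button.

```python
# Toy sketch of the electronic-nose thought experiment (all names and
# numbers hypothetical). The "experience" of chocolate is a single stored
# signature vector, matched against readings and deletable in one step.

class ElectronicNose:
    def __init__(self):
        # The device's whole knowledge of chocolate lives in this one entry.
        self.signatures = {"chocolate": [0.9, 0.1, 0.7, 0.3]}

    def identify(self, reading, tolerance=0.2):
        """Match a sensor reading against stored signatures by mean absolute error."""
        for name, signature in self.signatures.items():
            error = sum(abs(r - s) for r, s in zip(reading, signature)) / len(signature)
            if error < tolerance:
                return name
        return None

    def forget(self, name):
        # The "press of a button": the representation is local, so removing
        # one dictionary entry erases the experience completely.
        self.signatures.pop(name, None)

nose = ElectronicNose()
print(nose.identify([0.85, 0.15, 0.75, 0.25]))  # chocolate
nose.forget("chocolate")
print(nose.identify([0.85, 0.15, 0.75, 0.25]))  # None
```

The contrast with the brain, on Tononi’s picture, is precisely that no such single entry exists to delete: the information is distributed and entangled with everything else the system has registered.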
But ask a neuroscientist to do the same for your own experience of the smell of chocolate—to somehow delete this—and he or she would be faced with an impossible task since the experience is correlated with many different parts of the brain.
Indeed, the experience will be integrated with all kinds of other experiences. “According to Tononi, the information generated by such [an electronic nose] differs from that generated by a human insofar as it is not integrated,” say Maguire and co.
This process of integration is then crucial and Maguire and co focus on the mathematical properties it must have. For instance, they point out that the process of integrating information, of combining it with many other aspects of experience, can be thought of as a kind of information compression.
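The compression intuition can be made concrete with a small experiment. This is a minimal sketch of the general idea, not Maguire and co’s formalism: when pieces of information share structure, storing them together (integrated) takes fewer bits than storing each in isolation, because the shared structure is encoded once. The example strings are invented.

```python
# Illustrating "integration as compression" with a general-purpose
# compressor: jointly compressed, overlapping experiences cost fewer
# bytes than the sum of their separately compressed sizes.
import zlib

experiences = [
    b"the smell of dark chocolate, rich and slightly bitter",
    b"the smell of milk chocolate, rich and slightly sweet",
    b"the smell of hot chocolate, rich and comforting",
]

# Store each experience on its own: redundancy between them is wasted.
separate = sum(len(zlib.compress(e)) for e in experiences)

# Store them together: the compressor exploits their shared structure.
integrated = len(zlib.compress(b"\n".join(experiences)))

print(separate, integrated)
assert integrated < separate  # integration pays off in bits
```

The flip side, which connects back to the deletion point above: in the jointly compressed store there is no longer a separable region holding just one experience, so removing one memory is no longer a local edit.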
Appearing at a news conference at an Osaka hotel accompanied by her lawyers, the 30-year-old researcher at the government-backed Riken institute denied allegations that she falsified or fabricated data presented in research papers published in the British science journal Nature in January.
“I sincerely apologize for suspicions raised over my research papers and the enormous trouble caused to Riken, co-authors of the papers and many others due to my carelessness, sloppiness and immaturity,” she said. “I think my mistakes were incredibly (immature) in the eyes of many researchers.
“But these mistakes do not affect the conclusions of my paper, and above all, the experiments have been solidly conducted and the data (proving STAP cells) exist.”
Wednesday’s news conference, jam-packed with hundreds of journalists and TV crews, marked Obokata’s first public appearance since Jan. 28, when she announced in Kobe that a team of Japanese and U.S. researchers led by her discovered STAP (stimulus-triggered acquisition of pluripotency) cells.
So, basically: This is big, you guys! Einstein big! Nature-of-space big! Big Bang-big!
There’s just one small thing, though. The findings shared this week also share a significant caveat: They haven’t yet been peer-reviewed. They are discoveries that are, as far as scientific institutionalism is concerned, provisional. They’re stuck in a kind of epistemological limbo—as information that has not yet been converted into fact, and data that have not yet been codified into knowledge. Official status: truthy.
Masatoshi Takeichi, director of RIKEN’s Center for Developmental Biology (CDB) in Kobe, to which Obokata belongs, said: “I can only say it was regrettable. I myself can’t understand what happened.”
When the alleged results were announced at a press conference in late January, many reporters attended because it was said to be a great discovery by a 30-year-old female scientist.
CDB Deputy Director Yoshiki Sasai attended the press conference in January with Obokata. He praised her achievements at the time, saying, “Her ability to conduct experiments and free thinking bore fruit, and she made a great discovery that deserves international praise.”
The media paid attention to the pink and yellow wallpaper in Obokata’s lab, and her kappogi, a traditional cooking smock that Obokata wears while working.
An investigative committee at RIKEN, which is funded primarily by the Japanese government, has been looking into charges that two high-profile papers published in January in the journal Nature included plagiarized material, duplicate photos and doctored figures. The papers described a new method for the speedy production of highly versatile stem cells, which the researchers dubbed STAP cells.
"The committee concluded that there had been inappropriate handling of data for two of the items under investigation, but the circumstances were not judged to constitute research misconduct," according to RIKEN’s statement.
Obokata shot to fame in Japan after the STAP papers received worldwide interest. The methods in the papers apparently provided a means to simplify and accelerate the process of making pluripotent stem cells that would be genetically matched to patients. In theory, such cells could be used to grow replacement tissues for patients with a range of diseases and chronic conditions, including diabetes, muscular dystrophy and spinal cord injuries.
Authors said they fundamentally transformed the blood cells of newborn mice by soaking them in a mildly acidic solution for 30 minutes. This near-fatal shock, they said, caused the cells to become pluripotent. They called them “stimulus-triggered acquisition of pluripotency,” or STAP, cells.
A co-author of a Japanese study that promised a revolutionary way to create stem cells has called for the headline-grabbing research to be retracted over claims its data was faulty.
The findings, published by Japanese researcher Haruko Obokata and US-based scientists in the January edition of British journal Nature, outlined a simple and low-tech approach in the quest to grow transplant tissue in the lab.
But it has faced hard questions as the Japanese research institute that sponsored the study launched a probe last month over the credibility of data used to reach the explosive findings.
At issue are allegations that researchers used erroneous image data for the high-profile Nature article.
Clinical psychology and counselling have long been rivals to orthodox psychiatry and psychoanalysis. Armed with an evidence-base rooted in experimental psychology methods, and able to point to the efficacy of certain behavioural therapies, these other psy-disciplines appeared as compacted and dense complexes of tradition and scientificity deployed in the name of commitment to a pragmatics that was not wedded to arcane diagnostic nosological arguments. When CBT arrived on the scene the cheer went up: at last a scientific therapeutic approach with measurable outcomes! At last a technique that we can tweak and alter for every occasion! With the arrival of CBT came a relatively brief intervention that was also relatively cheap in terms of training and implementation, and that could be made to fit every form of psychological suffering. Today we have reached the point where it has become more-or-less accepted that CBT can be used with people diagnosed as schizophrenic, a position that was initially resisted (in part because of unfounded historical biases about the chronicity and irreversibility of psychosis). There is a lot to say about CBT in terms of its history, its temporality, its functional obsession with technique, its religiosity, its basic repackaging of the earlier orthodox psychoanalytic demand
that subjects be made to adapt to the world in which they found themselves. All this will be dealt with elsewhere. Here, I want to point to one of the unique features of CBT: its infinite plasticity.
In this, and other ways, the relation between psychiatry and psychology—presented as competitive and/or antagonistic—is actually based on a prior mimetic relation, operating on their mutual obsession with scientificity. There is an endless proliferation of cognitive behaviour approaches that are modelled according to this semiological modifying specification. This also carried over into the current third wave cognitive therapies, especially those based on mindfulness, such as Mindfulness-CBT and the Californian-sounding “Acceptance and Commitment Therapy”. These “updates” to the psy-wear appear to respond to existing criticisms of CBT, in particular the charge that CBT approaches the existential murk, the weft of subjectivation, the embodied physiology of affective phenomenology, and the sexuated nature of subjectivation by a crudely rationalist reductionism. If CBT ignored our physio-affectivity, then Mindfulness and ACT seek to recognise it by way of “dwelling with” phenomenality. Patients are encouraged to “sit with” suicidal thoughts, to examine them as objective phenomena separate from themselves, and to learn to tolerate physiological states of hyperarousal (there is even a form of ACT for pain management; meditation being cheaper, but also safer, than reliance on opiates). If CBT codified subjectivity as entirely rationalist—and if it reappropriated the Freudian unconscious via a crude simplification and disavowal whereby the former’s complexity and nuance became the simplified stupidity of “cognitive schemas”—then Mindfulness and ACT return to the unconscious via meditative techniques. Yet this update is not an update but a kind of regression and intensification.
Far from addressing those criticisms we’ve touched on, the Third Wave returns CBT to its philosophical roots in Stoicism, a philosophy that has been criticised again and again as quietistic, overly cognitive, cruelly heartless, and, ultimately, a strange admixture of materialist and idealist elements.
Scientific management, also called Taylorism, was a theory of management that analyzed and synthesized workflows. Its main objective was improving economic efficiency, especially labor productivity. It was one of the earliest attempts to apply science to the engineering of processes and to management.
Its development began with Frederick Winslow Taylor in the 1880s and 1890s within the manufacturing industries. Its peak of influence came in the 1910s; by the 1920s, it was still influential but had begun an era of competition and syncretism with opposing or complementary ideas.
Although scientific management as a distinct theory or school of thought was obsolete by the 1930s, most of its themes are still important parts of industrial engineering and management today. These include analysis; synthesis; logic; rationality; empiricism; work ethic; efficiency and elimination of waste; standardization of best practices; disdain for tradition preserved merely for its own sake or to protect the social status of particular workers with particular skill sets; the transformation of craft production into mass production; and knowledge transfer between workers and from workers into tools, processes, and documentation.
Scientific management’s application was contingent on a high level of managerial control over employee work practices. This necessitated a higher ratio of managerial workers to laborers than previous management methods. The great difficulty in accurately differentiating any such intelligent, detail-oriented management from mere misguided micromanagement also caused interpersonal friction between workers and managers.
The people filled with the spirit of capitalism to-day tend to be indifferent, if not hostile, to the Church. The thought of the pious boredom of paradise has little attraction for their active natures; religion appears to them as a means of drawing people away from labour in this world. If you ask them what is the meaning of their restless activity, why they are never satisfied with what they have, thus appearing so senseless to any purely worldly view of life, they would perhaps give the answer, if they know any at all: “to provide for my children and grandchildren.” But more often and, since that motive is not peculiar to them, but was just as effective for the traditionalist, more correctly, simply: that business with its continuous work has become a necessary part of their lives. That is in fact the only possible motivation, but it at the same time expresses what is, seen from the view-point of personal happiness, so irrational about this sort of life, where a man exists for the sake of his business, instead of the reverse.
Max Weber, The Protestant Ethic and the Spirit of Capitalism
Perhaps what Atheists are really arguing for is not Science, not the use of Scientific method of inquiry, but for the primacy of Work over anything else. What atheists essentially demand of the religious, in particular the poor, rural Christians, is that they should get back to their work, back to being scientifically managed at Wal-Mart and Amazon distribution centers.
Melvyn Bragg and his guests discuss the eye. Humans have been attempting to understand the workings and significance of the organ for at least 2500 years. Some ancient philosophers believed that the eye enabled creatures to see by emitting its own light. The function and structures of the eye became an area of particular interest to doctors in the Islamic Golden Age. In Renaissance Europe the work of thinkers including Kepler and Descartes revolutionised thinking about how the organ worked, but it took several hundred years for the eye to be thoroughly understood. Eyes have long attracted more than purely scientific interest, known even today as the ‘windows on the soul’.
Melvyn Bragg and his guests discuss Social Darwinism. After the publication of Charles Darwin’s masterpiece On the Origin of Species in 1859, some thinkers argued that Darwin’s ideas about evolution could also be applied to human society. One thinker particularly associated with this movement was Darwin’s near-contemporary Herbert Spencer, who coined the phrase ‘survival of the fittest’. He argued that competition among humans was beneficial, because it ensured that only the healthiest and most intelligent individuals would succeed. Social Darwinism remained influential for several generations, although its association with eugenics and later adoption as an ideological position by Fascist regimes ensured its eventual downfall from intellectual respectability.
Remember oxytocin, the so-called “love hormone”? Over the past few years, a series of well-publicized studies have suggested its activation inspires increased trust, altruism, and empathy.
But let’s hold off on any plans to inject it into tap water. Newly published research suggests the hormone, commonly associated with cuddling, may also inspire domestic violence.
“Far from being a panacea for all social ills,” writes a research team led by University of Kentucky psychologist C. Nathan DeWall, “oxytocin may have diversified effects, increasing the likelihood that people who are inclined toward physical aggression will inflict harm on their romantic partners.”
This raises the intriguing possibility that domestic violence could be decreased if a way could be found to suppress oxytocin in people who are predisposed to violence. It also provides new evidence that the initial view of the hormone as a uniformly positive force is way off-base.
Scientists at King’s College London and the University of Melbourne have found, using brain scans, that psychological stress may be to blame for unexplained physical symptoms, including paralysis and seizures.
Patients showed differences in brain activity when they recalled traumatic memories compared with healthy volunteers in a study published in last month’s edition of JAMA Psychiatry. Besides supporting Freud’s theory and helping to explain one of the most common complaints seen by neurologists, the research may lead to new treatment approaches for patients whose symptoms were often written off by doctors in the past.
“This is the first paper that I’m aware of that really shows that previous traumatic events can definitely trigger this kind of motor response,” said John Speed, a professor of physical medicine and rehabilitation at the University of Utah in Salt Lake City, who wasn’t involved in the research. “It’s very exciting.”
(Chomsky:) What you’re referring to is what’s called ‘theory.’ And when I said I’m not interested in theory, what I meant is, I’m not interested in posturing—using fancy terms like polysyllables and pretending you have a theory when you have no theory whatsoever. So there’s no theory in any of this stuff, not in the sense of theory that anyone is familiar with in the sciences or any other serious field. Try to find in all of the work you mentioned some principles from which you can deduce conclusions, empirically testable propositions where it all goes beyond the level of something you can explain in five minutes to a twelve-year-old. See if you can find that when the fancy words are decoded. I can’t. So I’m not interested in that kind of posturing. Žižek is an extreme example of it. I don’t see anything to what he’s saying.
(Žižek:) I think one can convincingly show that the continental tradition in philosophy, although often difficult to decode, and sometimes—I am the first to admit this—defiled by fancy jargon, remains in its core a mode of thinking which has its own rationality, inclusive of respect for empirical data. And I furthermore think that, in order to grasp the difficult predicament we are in today, to get an adequate cognitive mapping of our situation, one should not shirk the resorts of the continental tradition in all its guises, from the Hegelian dialectics to the French “deconstruction.” Chomsky obviously doesn’t agree with me here. So what if—just another fancy idea of mine—what if Chomsky cannot find anything in my work that goes “beyond the level of something you can explain in five minutes to a twelve-year-old” because, when he deals with continental thought, it is his mind which functions as the mind of a twelve-year-old, the mind which is unable to distinguish serious philosophical reflection from empty posturing and playing with empty words?
…since Turing, the possible job of a programmer has run the risk of forgetting mathematical elegance. Today, prior to the conquest of digital signal processors, the hardware of average computers is at a kindergarten level: of all the basic forms of computation, it barely manages addition. More complex commands have to be reconverted into a finite, that is, serial, number of cumulative steps. An unreasonable chore for humans and mathematicians. Where recursive, that is, automatizable, functions succeed classical analysis, computation works as a treadmill: through the repeated application of the same command on the series of interim results. But that’s it. A Hungarian mathematician, after he had filled two whole pages with the recursive formulas according to which a Turing machine progresses from 1 to 2 to 3, and so on, observed in German as twisted as it was precise: “This appears as an extraordinarily slowed-down film shot of the computation process of man. If this mechanism of computation is applied to some functions, you start living it, you begin to compute exactly like it, only faster.” Consolation for prospective programmers…
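The treadmill Kittler describes can be made visible in a few lines. This is a sketch of my own, under the passage’s premise that the machine has only one primitive step at its disposal: addition is reduced to a finite, serial repetition of the successor operation on interim results. (A real Turing machine would decompose even the successor step into head movements over a tape, which is why the Hungarian mathematician’s film runs so slowly.)

```python
# Addition as a "treadmill": repeated application of the one basic
# command, successor, on the series of interim results.

def successor(n):
    """The single primitive step the 'kindergarten' hardware manages."""
    return n + 1

def add(a, b):
    """Compute a + b by applying successor to a, b times in series."""
    result = a
    for _ in range(b):
        result = successor(result)  # each pass produces one interim result
    return result

print(add(2, 3))  # 5, reached via the interim results 3, 4, 5
```

Nothing here is parallel or clever; the speed of actual machines comes from running exactly this kind of cumulative series very fast, which is the consolation the passage offers prospective programmers.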