
Science, Society and Politics

Science is a remarkable tool available to humans for understanding what is true about the world. It has expanded the boundaries of our knowledge and challenged our preconceived notions of reality. Accordingly, scientific research has yielded a treasure trove of knowledge about many previously inaccessible domains of nature. The validity of this knowledge is confirmed by the fact that it has led to new technologies that are helping us live longer, healthier and more enriched lives.

Scientific research does not take place in a vacuum. It is a social activity with political overtones, and scientists are very much aware of the intricate interplay of science, society and politics. Perhaps one of the most persuasive arguments regarding the rightful place of science in modern society was articulated by the American engineer and science administrator Vannevar Bush in Science: The Endless Frontier, a report prepared in July 1945 for US President Harry Truman. In it, he noted that the “social contract between science and society allows scientists alone to decide what research best serves the society.”

Having said that, the practice of science is never entirely free of politics, which makes its presence felt in science through money. While philanthropists and private foundations fund scientific research to some extent, most research is shaped by the funding landscape of government, and therein lies the conflict between science and politics.

Since decisions about funding allocation are made by politicians, the choice of what type of science a scientist should pursue is no longer a scientific decision but a political one. Furthermore, there are examples of politicians punishing or favouring scientists for ideological reasons. A case in point is Trofim Lysenko, a Soviet agronomist and biologist whose work was enthusiastically endorsed by the Soviet government under Stalin because his theories supported the principles of Marxism. Hence the term Lysenkoism, used to describe the manipulation of the scientific process to achieve ideological goals. On the other hand, Andrei Sakharov, who holds an honoured place in the pantheon of distinguished physicists, was discredited by the Soviets because of his dissident humanitarian voice.

In the wake of the Covid-19 pandemic, which has so far claimed nearly 1.2 million lives worldwide, the relationship between science and politics is now squarely at the centre of the world stage. While the world looked to the United States to lead the fight against Covid-19, President Donald Trump, defying science, played down the severity of the virus by saying, “It is what it is.” Not surprisingly, there has been a surge of new cases in the USA, while leaders who carefully straddle the fine line between science and politics have managed to contain the spread of the virus.

Regardless, scientists are working tirelessly to develop Covid-19 vaccines. Trials are underway to test whether the BCG vaccine, originally developed against tuberculosis, can provide at least temporary protection against the virus, marking the first time a vaccine is being tested against a pathogen other than the one it was designed for. At the same time, researchers in the United Kingdom found that patients injected with T-cells, the white blood cells of key importance to our immune system, responded positively to the Covid-19 virus.

Another example of the conflict between the value-laden space of political decision-making and the factual, objective world of science is climate change. Scientific evidence of climate change has helped create a robust social and political debate about reducing greenhouse gas emissions. However, instead of responding positively to the debate, leaders of fossil fuel-producing countries are focusing on the uncertainties of climate models, or rejecting outright the findings of scientists, thereby sowing seeds of doubt about what constitutes “good” science.

Nevertheless, scientists are trying to convince politicians that it would serve all of us well if they used scientific facts as neutral information to guide public policy. Lest we forget, politicians need the knowledge that scientists possess in order to give us a decent shot at enjoying the full benefits of living in a high-tech world. Otherwise, they risk making ill-informed decisions on issues that are highly technical and complex.

Politics aside, scientific research and innovation are principally responsible for decades of economic growth and medical advances. Indeed, scientific discoveries, along with advanced techniques and instruments developed by scientists, particularly physicists, in the past 100 years or so have ushered in a new era in medical science.

The era began in 1895 with the discovery of X-rays, used today as a diagnostic tool to see through different parts of our body. X-ray imaging was dramatically improved by the invention of computerised tomography. Other technologies, for instance nuclear magnetic resonance, are allowing us to recover from illnesses that in the past would have been fatal. Additionally, positron emission tomography, or PET scan, developed after the discovery of the positron, the antiparticle of the electron, allows doctors to check for diseases in our body and helps them see how well our organs and tissues are working.

Advances in laser physics have also had a considerable impact on medical research. Soon after the advent of lasers in 1960, they found their way into medical applications, notably ophthalmology, dermatology, cosmetic surgery, oncology, dentistry and more. More importantly, lasers allow surgeons to work with high precision by focusing on a small area, damaging less of the surrounding tissue.

We could not do without radioactive materials in today’s world, even if we wanted to. Radioactive isotopes, discovered in the early 20th century, are an integral part of nuclear medicine and are commonly used to treat some cancers and medical conditions that require shrinking or destruction of harmful cells.

The use of nanotechnology in the medical sciences is a rapidly expanding field. Originating from the Greek word nanos (dwarf), “nano” describes length scales of the order of a millionth of a millimetre. Although the field is still in its infancy, there is growing interest in the medical community in using the technology for targeted drug delivery, cancer treatment, nano-biosensors and nano-medical imaging.

The discovery of graphene in 2004 is among the highlights of materials science and nanotechnology. Graphene is a sheet of carbon atoms just one atom thick, arranged in a honeycomb lattice, with remarkable physical and chemical properties. It has potential applications in a wide range of areas of the biomedical sciences. Chief among them is DNA sequencing, the gold standard for successful diagnosis of various diseases.

In 1938, when physicists successfully split the atomic nucleus (fission), humanity gained access to something extremely potent: the tremendous amount of energy released during the fission process. Immediately recognised as the basis for weapons of mass destruction, fission is now used to generate around ten percent of the world’s electricity.
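The scale of that energy can be sketched with a back-of-the-envelope calculation. The reaction channel and atomic masses below are approximate textbook values I have supplied for illustration, not figures from the article:

```python
# Estimate the energy released when one uranium-235 nucleus fissions,
# from the mass defect via Einstein's E = mc^2.
# Approximate masses in atomic mass units (u); 1 u corresponds to ~931.5 MeV.
m_U235  = 235.04393   # uranium-235
m_n     = 1.00867     # neutron
m_Ba141 = 140.91441   # barium-141 fragment
m_Kr92  = 91.92617    # krypton-92 fragment

# One common channel: U-235 + n -> Ba-141 + Kr-92 + 3n.
# The mass "lost" in the reaction is carried away as energy.
mass_defect = (m_U235 + m_n) - (m_Ba141 + m_Kr92 + 3 * m_n)  # in u

energy_MeV = mass_defect * 931.5  # E = mc^2, with c^2 folded into 931.5 MeV/u
print(f"{energy_MeV:.0f} MeV per fission")
```

Roughly 170 MeV per nucleus, tens of millions of times the few electron-volts released per molecule in chemical combustion, which is why a small amount of nuclear fuel yields so much electricity.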

The constant “h,” introduced by Max Planck in 1900 to explain the spectra of thermal radiation, is the fundamental constant of quantum theory. Because it governs the scale of quantum effects in the subatomic world, it has had profound ramifications for technology. For example, it enabled the construction of microcircuits, quantum computers, transistors and semiconductors, lasers, iPods, cell phones and digital cameras that have changed the trajectory of our life from ordinary to extraordinary.
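As a simple illustration of how h sets the quantum energy scale, one can compute the energy of a single photon of visible light via Planck’s relation E = hf; the wavelength chosen below is my own example, not something specified in the article:

```python
# Planck's relation: a photon of frequency f carries energy E = h*f.
h = 6.62607015e-34   # Planck constant, J*s (exact SI value)
c = 2.99792458e8     # speed of light, m/s
eV = 1.602176634e-19 # joules per electron-volt

wavelength = 532e-9       # green light, ~532 nm (illustrative choice)
f = c / wavelength        # frequency, ~5.6e14 Hz
E_eV = h * f / eV         # photon energy in electron-volts
print(f"{E_eV:.2f} eV")   # about 2.3 eV
```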

It is now almost impossible to get lost, whether on land, in the sky or on the ocean, thanks to Einstein’s special and general theories of relativity, which play a big role in the design of Global Positioning System satellites that give accurate readings of the position, speed and direction of an object in real time. The satellites would fail in their navigational functions if the relativistic effects of time dilation and spacetime curvature on their clocks were left uncompensated.
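The size of those relativistic effects can be sketched with first-order formulas. The orbital parameters below are approximate published values for GPS, and the calculation is a simplification supplied here for illustration:

```python
# First-order relativistic clock corrections for a GPS satellite.
GM      = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
c       = 2.99792458e8    # speed of light, m/s
R_earth = 6.371e6         # mean Earth radius, m
r_sat   = 2.6571e7        # GPS orbital radius (~20,200 km altitude), m
day     = 86400.0         # seconds per day

v = (GM / r_sat) ** 0.5   # orbital speed, ~3.9 km/s

# Special relativity: the moving clock runs slow (about -7 microseconds/day)
sr = -(v**2 / (2 * c**2)) * day

# General relativity: a clock higher in Earth's gravity well runs fast
# (about +45 microseconds/day)
gr = (GM / c**2) * (1 / R_earth - 1 / r_sat) * day

net = sr + gr                 # net drift, roughly +38 microseconds/day
range_error = abs(net) * c    # uncorrected, ~11 km of ranging error per day
print(f"{net * 1e6:.1f} us/day, {range_error / 1000:.0f} km/day")
```

Left uncorrected, a drift of tens of microseconds per day would translate into kilometres of position error, which is why the satellite clocks are deliberately adjusted to compensate.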

A final thought on World Science Day for Peace and Development. In the past, scientists who challenged politicians for ignoring their advice were accused of behaving unethically. But as we stare down the barrel of an ongoing global pandemic, we should realise that society shapes politics, politics controls science, and science informs both society and politics. So, as we move forward, a harmonious relationship among the three is ever more important in today’s fractious world.

Quamrul Haider is a Professor of Physics at Fordham University, New York.


Are teachers the “Luddites” of higher education?


According to tech-employment experts, more than half the jobs in the United States will be automated within a decade or two. That should not come as a surprise. Robots are already working as telemarketers, replacing assembly-line workers, whisking products around Amazon’s huge shipping centres, diagnosing medical conditions and performing minimally invasive surgeries. They are writing stories for newspapers and magazines, too.

For all its ambiguities, technology has also made its way into the arena of higher education. Today, we are fascinated by videotaped lectures. We revel in the online learning format MOOC ‒ an acronym for Massive Open Online Course. We rave about Coursera ‒ a venture-backed, education-focused technology company. We rant about Udacity ‒ a for-profit educational organisation offering MOOCs.

Courses offered by these asynchronous programs do not take place in a real-time environment; as a result, there is no class meeting time. They enrol tens of thousands of “followers,” a Twitter term I prefer because it is a more apt label than “students.” The followers are provided with syllabi and assignments and are given a time frame to complete the coursework and exams. Interaction with instructors usually takes place through discussion boards, blogs and wikis.

So, what happens next? One clue might lie in early nineteenth-century Britain, where the intrusion of mechanised technology into the textile production process ignited the Luddite rebellion, named after Ned Ludd, a mythical weaver who lived in Sherwood Forest. He supposedly broke two mechanical knitting machines to vent his anger against automation.

Incensed at the machines that they believed would replace them, the textile workers or the Luddites, as they were called, raided factories and sabotaged machinery by night, in the hopes of saving their jobs. The rebellion was a total failure. Nonetheless, the Luddites bequeathed us a namesake pejorative hurled at anyone daring to stand in the way of technological progress. The term Luddite has now become a synonym for technophobe.

I write this piece not as a technophobe, but as an open-minded professor sceptical about technology’s impact on the state of higher education. I have enthusiastically experimented with YouTube clips, Facebook course pages and discussion blogs in many of my courses. I appreciate typesetting tools, particularly TeX/LaTeX ‒ a high-quality system designed for the production of technical and scientific documentation. I value the usefulness of the Internet, which gives me access to a treasure trove of information on an untold number of subjects, as well as the technical journals essential for doing research. As a theoretical nuclear physicist, I am grateful for computational software such as Mathematica, which took the sweat out of high-level mathematical calculations.

Nevertheless, I am also disturbed by some aspects of online education’s impact on learning and scholarship. That is because, from the vantage point of science pedagogy, technology has yet to offer an adequate answer to a question that should always be at the forefront of our conversations: how much does the whole person matter?

It is obvious that online education has cut out the bricks and mortar frills of a normal campus and replaced classrooms with a computer screen on top of a desk at a student’s home. While proponents of online learning would like us to believe that their ostensibly laser-like focus on higher education is admirable, one cannot help but wonder about the value of the traditional liberal arts college experience that is lost in the process.

As many have noted, the experience of a lecture hall ‒ usually a metaphor for college as a whole ‒ has not changed all that much in the last 500 years or so. Standing at the podium or writing on a chalkboard, professors edify students by pouring forth their knowledge. Whether the endurance of this long-established format is a virtue or a vice depends on how close your postal code is to California’s Silicon Valley.

Against this age-old backdrop, enter the heroic innovators ‒ the techno-utopians. In their view, online learning offers a solution to the various crises higher education is facing today. In particular, it accommodates flexible scheduling, a comfortable learning environment and a wide variety of programs and courses, and it strips down costs, so that education can reach people not privileged enough to afford the sticker shock of today’s tuition fees.

Is online learning really making good on the promises the techno-utopians are making? Numerous studies over the years have shown that technology hurts students’ progress more than it helps. The studies conclude that students who rely solely on modern technology to get their degree in quick and easy doses often lack the ability, and more importantly the patience, to think and study the old-fashioned way. They belong to a generation of digital natives who are apparently incapable of prying themselves away from their computer screens for even a 50-minute classroom lecture.

Furthermore, recipients of degrees from online educational institutions should be prepared to face a few initial hiccups, simply because their degree is likely to be considered of much lower value than one obtained through mainstream classroom education. Consequently, prospective employers may be sceptical about the credibility of even well-known online learning enterprises, which generally offer only certificates of course completion. Online programs are, however, appropriate learning environments for adults with time constraints or busy schedules, or those who want to take enrichment courses to advance their career.

Others have noted that online course innovations seem uniquely tilted in favour of fields like science, engineering and mathematics, and less suitable for subjects like history, philosophy or English. In that sense, technology has a bit of a bias, as any bleary-eyed humanities professor who cannot feed a stack of essays into a Scantron machine will tell us.

Even if online learning does get better at spreading knowledge, can it ever match college’s time-honoured strength in cultivating wisdom? Confronting that challenge requires us to answer the question of how much the whole person really matters. Technology seems to suggest it does not and should not. Indeed, the ideology of technology is to disaggregate the whole person ‒ to stretch human faculties to the point where space and time become irrelevant.

Arguably, college, at its best, is all-encompassing. It is a place where one undergoes intellectual, social and spiritual transformation. Yes, education happens in the lecture hall. An ineffable, unpredictable vibe that a great class discussion generates leaves its participants buzzing.

But education also happens on a theatre stage, in museums and art galleries, at an atelier, at a research lab at a hospital, in the study abroad program and many other places outside the classroom. It remains unclear how MOOC, Coursera, Udacity, or technology in general can help cultivate wisdom across all of these fronts and thus enrich the whole person that college education epitomises.

Although we may be at the dawn of a post-human era, as some have argued, I believe that we are losing more than we are gaining from a technological hypnosis that has the potential to reclassify the teacher as a network administrator. If we can avoid bowing to the pressures to convert higher education into virtual reality, we will preserve something essential to our humanity: a sense of community.

To that end, we still need to be face-to-face with students, to meet with them in groups for discussion, or to have one-on-one meetings with a student seeking guidance. These relational roles, and the human touch of a teacher, can only be fulfilled in a campus environment.

Having said that, are we facing our own virtual obsolescence, just like the Luddites? Only time will tell whether we will become neo-Luddites. However, if the predictions about automation are even halfway correct, I am afraid higher education in a campus setting may soon become redundant, as the techno-utopians forecast, when a one-size-fits-all online education presents itself to institutions looking to streamline their overhead. If that day arrives, it won’t just be the faculty’s loss; it could be a loss of our students’ sense of wholeness too.

 

The writer is Professor of Physics at Fordham University, New York