Lesson AWF-3

Welcome! Please consider how your attitude affects your own and other students' experience of the lesson.

Be respectful, come prepared, and show interest to have the best possible educational experience. 

Lesson goals


Test your vocabulary with a quiz, read scientific articles, and develop your reading and writing skills by writing a short analysis that uses sources.

Lesson activities


Vocabulary quiz

Reading comprehension

Writing practice

Vocabulary quiz


Please wait for Feke to share the test code.

Rephrasing - how it works

Read the first sentence. Complete the second sentence so that it has a similar meaning to the first sentence by putting the word in bold in the blank space (...). Change the word if necessary. You must use between two and five words. For example:


Reading comprehension


Should we welcome robot teachers?

Should we welcome robot teachers?

Amanda J. C. Sharkey

Ethics and Information Technology, volume 18, pages 283–297 (2016)


Abstract

Current uses of robots in classrooms are reviewed and used to characterise four scenarios: (s1) Robot as Classroom Teacher; (s2) Robot as Companion and Peer; (s3) Robot as Care-eliciting Companion; and (s4) Telepresence Robot Teacher. The main ethical concerns associated with robot teachers are identified as: privacy; attachment, deception, and loss of human contact; and control and accountability. These are discussed in terms of the four identified scenarios. It is argued that classroom robots are likely to impact children's privacy, especially when they masquerade as their friends and companions, when sensors are used to measure children's responses, and when records are kept. Social robots designed to appear as if they understand and care for humans necessarily involve some deception (itself a complex notion), and could increase the risk of reduced human contact. Children could form attachments to robot companions (s2 and s3), or robot teachers (s1) and this could have a deleterious effect on their social development. There are also concerns about the ability, and use of robots to control or make decisions about children's behaviour in the classroom. It is concluded that there are good reasons not to welcome fully fledged robot teachers (s1), and that robot companions (s2 and s3) should be given a cautious welcome at best. The limited circumstances in which robots could be used in the classroom to improve the human condition by offering otherwise unavailable educational experiences are discussed.


Conclusions

Now that we have considered the main ethical issues raised by, and the reasons in favour of, classroom robots, some implications about the relative acceptability of the four classroom robot scenarios can be drawn. These conclusions are based on the current and likely near future abilities of social robots, and it is acknowledged that they might need to be revisited if robots with significantly greater abilities are developed.

There are reasons to support the use of Telepresence robots (Scenario 4) when they are used to provide educational opportunities that would otherwise be inaccessible. For instance, they could be used to facilitate children’s access to remote skilled teachers unavailable in their school. Their use as a cost-cutting measure should still be viewed with suspicion, and they do give rise to concerns about privacy and sharing of information, but nonetheless they could usefully supplement regular classroom teaching in some circumstances. Their use to facilitate contact with teachers and speakers of a foreign language seems appropriate, and if they are deployed in a classroom in which a human teacher is also available, there would be less need to be concerned about the issues of control and autonomy, and attachment and deception.

Companion and peer robots designed to foster implicit learning (Scenario 2 and 3) seem quite likely to appear in schools because they can function under the auspices of the human teacher without the need to control the classroom, or to appear fully competent. If such robots are to be welcomed, their welcome should be a cautious one because of the need to establish the educational effectiveness of such measures, particularly when compared to cheaper alternatives such as educational software and virtual coaches. In addition, since such robots masquerade as children’s friends, there are concerns about the extent to which they would violate their privacy, and a risk that they would have a deleterious impact on their learning about social relationships. Nonetheless, if concerns about privacy and social relationships were addressed, it is possible that such robots could be used to offer new educational opportunities. For example, the idea of developing a care-eliciting robot that encourages children to teach it new concepts or skills (and thereby reinforce their own learning) seems a promising one. Similarly companion robots could be developed to provide individualised practice for children on tasks that require repetition (and that might be too dull or time consuming for human teachers). It also seems plausible that children might be more willing to admit a lack of understanding, or a need for repeated presentation of material to a robot than to a human adult. 

The use of fully fledged robot teachers (the extreme of Scenario 1) is surely something that should not be encouraged, or seen as a goal worth striving for. There seems no good reason to expect that robot teachers would offer extra educational benefits over a human teacher. It is also apparent that robot teachers will not be able to form an adequate replacement for humans in the near future. Robots are unlikely to have the ability to keep control of a room full of children in the absence of a human teacher (except in a nightmare situation where they could administer physical restraint and punishment to make up for their own shortcomings). A robot could be programmed to deliver educational material, but it is not at all clear that children would learn that material once the initial novelty of the robot teacher had worn off. In addition, even if it were possible to program robots to deliver a curriculum, that would not make them good teachers. A good teacher should be able to identify the zone of proximal development for a child, and be able to teach them just what they need to know, just when they need to know it (Pelissier 1991). As discussed by Sharkey (2015), a robot is unlikely to be able to determine the relevant information to teach to a student in any meaningful way. As non-humans, how could robots determine what human children need to know, or have the intention to pass on the information that is needed to accomplish the tasks required in human culture (Kline 2015)? First and foremost, children need to be taught by fellow human beings who understand them, care for them, and who form appropriate role models and attachment figures.

Source: https://link.springer.com/article/10.1007/s10676-016-9387-z 

Experiments that led to the first gene-edited babies

Experiments that led to the first gene-edited babies: the ethical failings and the urgent need for better governance

Jing-ru Li, Simon Walker, Jing-bao Nie & Xin-qing Zhang 

Journal of Zhejiang University-SCIENCE B, volume 20, pages 32–38 (2019)


Abstract

The rapid developments of science and technology in China over recent decades, particularly in biomedical research, have brought forward serious challenges regarding ethical governance. Recently, Jian-kui HE, a Chinese scientist, claimed to have “created” the first gene-edited babies, designed to be naturally immune to the human immunodeficiency virus (HIV). The news immediately triggered widespread criticism, denouncement, and debate over the scientific and ethical legitimacy of HE’s genetic experiments. China’s guidelines and regulations have banned germline genome editing on human embryos for clinical use because of scientific and ethical concerns, in accordance with the international consensus. HE’s human experimentation has not only violated these Chinese regulations, but also breached other ethical and regulatory norms. These include questionable scientific value, unreasonable risk-benefit ratio, illegitimate ethics review, invalid informed consent, and regulatory misconduct. This series of ethical failings of HE and his team reveals the institutional failure of the current ethics governance system, which largely depends on scientists’ self-regulation. The incident highlights the need for urgent improvement of ethics governance at all levels, the enforcement of technical and ethical guidelines, and the establishment of laws relating to such bioethical issues.


Conclusion

In conclusion, gene editing techniques are not sufficiently safe or effective to be used on human reproductive cell lines. Evidence for the safety and effectiveness of this technology can only be obtained through basic and preclinical research, on the basis of strictly following technical standards and ethical norms. The construction of regulations and laws should be accelerated to meet the rapid development of emerging biotechnologies. Existing technical and ethical guidelines should be refined and more rigorously enforced to guide and standardize relevant research and applications. The lessons of illegal stem cell therapy in the Chinese mainland in recent years should not be forgotten, and stakeholders must take actions on regulating CRISPR-Cas germline editing as early as possible (Zhang, 2016).

The academic community should respect the dignity of human life and remain sensitive to the risks that research can present to participants and the wider community. Biomedical researchers and practitioners must abide by the relevant regulations and laws and firmly hold to the well-established ethical guidelines for the safe translation of scientific results to human health. Building the ethical review capacity at all levels should be strengthened. Ethics education and training should be provided to researchers, medical practitioners, and EC members, and education programs on science and ethics should be provided to the general public.

We have entered the era of human gene therapy. Somatic gene editing has been used on patients for a long time, and it has helped improve the lives of cancer patients and patients with inherited genetic diseases (Qiu, 2016; Dunbar et al., 2018). The clinical application of human germline gene editing is in the near future. We appeal to policy-makers to pay serious attention to the relevant issues, actively confront the challenges, and come up with a responsible and feasible pathway for clinical translation of human germline gene editing. Contemporary bioethics governance of human germline gene editing and other areas must by definition be transnational and global. More transcultural dialogues between China, the West and the rest of the world are much needed (Nie and Fitzgerald, 2016).

Source: https://link.springer.com/article/10.1631/jzus.B1800624 

Writing practice

Time for the activity: ~40-60 minutes


1. Write a 200-300 word text in which you discuss one of the topics: robotics or genetic engineering.

2. Write a paragraph that consists of three main ideas:

3. Restate your prediction, compare it with your source and answer your thesis question: How optimistic are you about robotics/genetic engineering?

4. Finally, submit the text to Unikum for feedback. Later, this text can also be used when you write your final "A Warning to the Future" essay.


For those of you who want to develop even further:

Example text


Despite its immense potential to transform various aspects of our lives, I remain cautious about the future of robotics. While robotics offers the promise of automating tedious and dangerous tasks, enhancing productivity, and improving healthcare, the ethical implications of its widespread adoption raise significant concerns.


Robotics raises several ethical concerns. Amanda J. C. Sharkey's article "Should we welcome robot teachers?" aptly highlights the potential for privacy violations, unhealthy attachments, and the loss of human connection when robots are integrated into sensitive environments like schools. Sharkey's work underscores the importance of careful consideration before fully embracing robotics, particularly in areas that demand human empathy and social interaction, especially when considering "a nightmare situation where they could administer physical restraint" (Sharkey, 2016). While Sharkey's article raises valid concerns, my optimism about robotics is tempered by the need for responsible implementation. The potential benefits of this technology are undeniable, but the risks of unintended consequences and the exacerbation of existing societal challenges cannot be overlooked. As we continue to develop and refine robotic technologies, it is crucial to prioritize ethical considerations and ensure that robots augment, rather than replace, human capabilities.


Ultimately, the impact of robotics will depend on our choices. If we approach robotics with a focus on ethical principles and human well-being, it can serve as a powerful tool for progress. However, if we fail to address the ethical concerns, which I strongly believe will arise, robotics could lead to unintended consequences and exacerbate existing societal challenges. As we venture into the age of robotics, it is imperative to strike a balance between innovation and responsibility, ensuring that this technology serves as a force for good, since it currently seems mainly to be introducing new and more terrifying issues.


Received feedback: 


Complexity:


The overall complexity of your text is good, with a mix of complex sentence structures and vocabulary.

Consider incorporating more varied sentence structures to enhance complexity further.

Accuracy:


Your text is accurate in conveying your thoughts and concerns about robotics.

Ensure precision in language use; for instance, in the phrase "the risks of unintended consequences," consider specifying examples of unintended consequences to add clarity.

Fluency:


The text flows well and maintains a logical progression of ideas.

To enhance fluency, consider varying sentence lengths and structures even more, ensuring a smooth transition between ideas.

Suggestions for Improvement:


Introduce more diverse sentence structures, including a mix of short and long sentences, to add variety and complexity.

Provide specific examples when discussing risks or unintended consequences to offer more clarity and specificity.

Consider incorporating transitional phrases to strengthen the flow between different sections of your text.

Quote

Use content from the source, keeping its exact wording and marking it with quotation marks ("").

Paraphrasing

Rewrite the information using your own words. The content should stay as close to the source as possible.

Summary

The original content is heavily condensed with the goal of capturing the main ideas.

Homework


Submit the text on Unikum.

Exit ticket


No exit ticket; your submitted text counts as one.