The General Data Protection Regulation (GDPR) becomes applicable on 25 May 2018. Its long-arm territorial reach imposes obligations not only on EU establishments but on US-based companies as well; global connection through the internet makes such broad application all the more likely, and it will affect US businesses. One of the prerequisites for the safe transfer of data between the EU and the US is already in place: the EU-US Privacy Shield agreement, which the European Commission considers to provide adequate guarantees for data transfers. Under the Privacy Shield scheme, companies may self-certify and adhere to the principles stated therein. Yet fewer than 3,000 US companies currently participate in the Privacy Shield, and the GDPR's safeguards must still be followed. Below, we look at some of the most important aspects of GDPR compliance for US (non-EU) based companies.
Data protection officer
Although not always obligatory under the GDPR, it is advisable that a company appoint a data protection officer (‘DPO’) or designate that role to a specific position in the company. A DPO can also be appointed externally. There may be a single DPO for several companies, or several persons sharing the DPO role within one company. The position need not carry that exact title; it may be a privacy officer, compliance officer, etc. Such a person should possess expert knowledge of the GDPR and data privacy, and may have a legal, technical or similar background; apart from requiring expert knowledge, the GDPR is not specific about the person's qualifications. The DPO's role is to inform and advise the controller, processor and employees, to monitor compliance, to cooperate with the supervisory authority, to train staff, and to help perform data protection impact assessments.
Data Protection Impact Assessment
A further step that companies affected by the GDPR, including US companies, should take in order to evaluate the risk of a data breach is to perform a data protection impact assessment (‘DPIA’). A DPIA is a thorough overview of the company's processes, and it can be carried out with the help of the data protection officer. It may take the form of a template with a series of questions to be answered for each processing activity. The DPIA has to be detailed and cover all operations in the company; its function is to anticipate situations, involving the processing of personal data, in which breaches may occur. Pursuant to Article 35 of the GDPR, a DPIA should contain a systematic description of the envisaged processing operations and the purposes of the processing; an assessment of the necessity and proportionality of the processing operations in relation to the purposes; an assessment of the risks to the rights and freedoms of data subjects; and the measures envisaged to address those risks, including safeguards and security measures. A DPIA is a very useful way of demonstrating compliance, and it is also, first and foremost, a tool that helps the company itself gain an overview of its processing activities and an indication of where a breach could happen.
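The Article 35 elements above can be captured as one record per processing activity. A minimal sketch in Python; the class and field names are illustrative assumptions, not any standard DPIA schema:

```python
from dataclasses import dataclass, field

@dataclass
class DpiaEntry:
    """One row of a DPIA: a single processing activity under review."""
    processing_operation: str                       # systematic description of the operation
    purposes: str                                   # purposes of the processing
    necessity_assessment: str                       # necessity and proportionality vs. purposes
    risks: list = field(default_factory=list)       # risks to data subjects' rights and freedoms
    safeguards: list = field(default_factory=list)  # measures addressing those risks

    def gaps(self):
        """Return which Article 35 elements are still missing from this entry."""
        missing = []
        if not self.processing_operation:
            missing.append("description")
        if not self.purposes:
            missing.append("purposes")
        if not self.necessity_assessment:
            missing.append("necessity")
        if not self.risks:
            missing.append("risks")
        if not self.safeguards:
            missing.append("safeguards")
        return missing
```

Running `gaps()` across every entry gives the company a quick view of where the assessment is still incomplete before it is presented to a supervisory authority.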
A US (non-EU based) company has to appoint an EU representative if its business involves offering goods or services, even free ones, to natural persons in the EU, or if its processing relates to monitoring the behaviour of data subjects in the EU. Such behaviour may include monitoring data subjects' internet activity in order to evaluate or predict their personal preferences, behaviours and attitudes. An EU representative is not obligatory when the processing is occasional, does not include large-scale processing of special categories of data (such as genetic data, biometric data, data concerning health, ethnic origin, political opinions, etc.) and is unlikely to result in a risk to the rights and freedoms of natural persons. However, given that these exceptions are rather vague, in most cases companies whose operations towards persons in the EU are not negligible will have to appoint a representative. The representative must be located in one of the EU Member States where the relevant data subjects are. The representative performs its tasks according to the mandate received from the controller or processor, including cooperating with the competent supervisory authorities regarding any action taken to ensure compliance with the Regulation, and is also liable and subject to enforcement in case of non-compliance.
One key word of respect for privacy runs throughout the GDPR: consent. If companies wish to process the data of natural persons who are in the EU, they must first obtain consent to do so. Consent must be freely given, informed, specific and unambiguous.
Freely given consent presupposes that the data subject must not feel pressured or urged to consent, or be subjected to non-negotiable terms. Consent is not considered freely given if the data subject has no genuine or free choice. The data subject must not be reluctant to refuse consent for fear that refusal will bring detrimental consequences. If the consent is pre-formulated by the controller, which is usually the case, its language must be clear, plain and easily understandable for the data subject. Further, if data are processed for several purposes, consent must be given for each purpose separately: consent must be specific, not abstract or vague. Silence, pre-ticked boxes or inactivity does not constitute consent under the GDPR.
Informed consent means that the data subject must know what the consent is for. He or she must be informed about what the consent entails, and there must be no unknown or undetermined issues. It is the controller's duty to inform the data subject about the scope and purpose of the consent, in clear and plain language. One must be careful, however: in today's world of fast-moving technologies, a person faces an overflow of consents to give within a short period of time, which may produce ‘click fatigue’[1], with people no longer reading the information accompanying a consent and clicking routinely without any real thought. Controllers therefore have to design consent forms in such a way that the person actually reads and understands what he or she is consenting to. This could be a combination of yes/no questions, varying the position of tick boxes, visually appealing text accompanying the consent, etc.
Consent must be unambiguous, i.e. clearly given. There must be no room for interpretation as to whether consent was given for a certain purpose. As to its form, consent may be given by ticking a box, choosing technical settings and the like (Recital 32 GDPR).
The data subject gives consent for the processing of his or her personal data. Companies have to bear in mind, however, that the concept of personal data in the EU is understood broadly: it includes all personally identifiable information (PII), ranging from obvious data such as name and postal address to less obvious data that is still covered by the GDPR, such as IP addresses.[2] In the US, by contrast, an IP address is not so clearly considered PII. In that regard, US-based companies must tighten their protection and apply the broader EU standard.
Privacy by design implemented
Privacy by design is a concept that brings together legal requirements and technical measures; it is a smooth way of incorporating the law into the technical structure of a business. Applied properly from the outset, privacy by design helps ensure compliance with the GDPR's requirements. It points to the principles of data minimisation, under which only data that is necessary should be processed, and storage limitation, which calls for a periodic review of storage and the automatic erasure of data that is no longer necessary.
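Storage limitation can be made concrete as a periodic retention sweep. A minimal sketch, assuming hypothetical record categories and retention periods chosen purely for illustration (real periods depend on the purpose and legal basis of each processing activity):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule; the categories and periods are assumptions.
RETENTION = {
    "marketing_contact": timedelta(days=365),
    "support_ticket": timedelta(days=730),
}

def expired_records(records, now=None):
    """Return the records whose retention period has elapsed and which
    should therefore be erased or anonymised in the next sweep."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records
            if now - r["collected_at"] > RETENTION[r["category"]]]
```

Run on a schedule, such a sweep turns the storage-limitation principle from a policy statement into an auditable, automatic process.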
One way of demonstrating compliance through privacy by design is ‘pseudonymisation’. Pseudonymisation is defined in the GDPR as the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information. Such additional information must be kept separately, so that it cannot be connected to an identified or identifiable natural person. Pseudonymisation is not anonymisation and should not be confused with it. Anonymisation is a technique that results in irreversible de-identification; since it completely prevents identification, anonymised data is not subject to data protection under the GDPR. Pseudonymisation only reduces the linkability of a dataset with the original identity of a data subject, and is accordingly a useful security measure.[3]
Binding corporate rules
Binding corporate rules (‘BCR’) comprise a set of principles, procedures and personal data protection policies, as well as a binding clause, adopted by the company and approved by the competent supervisory authority. Adopting binding corporate rules is not a simple process, but it means being on a safe track: BCR are one of the safeguards envisaged by the GDPR. According to Article 47 of the GDPR, BCR should include the structure and contact details of the company; the categories of personal data, the type of processing and its purposes; the application of general data protection principles (such as purpose limitation, data minimisation, limited storage periods, data quality, data protection by design and by default, the legal basis for processing, and the processing of special categories of personal data); the rights of data subjects; the tasks of the data protection officer; complaint procedures; mechanisms for reporting to the competent supervisory authority; appropriate data protection training for personnel; and an indication that the BCR are legally binding. The BCR should additionally be accompanied by privacy policies, guidelines for employees, a data protection audit plan, examples of the training programme, a description of the internal complaint system, a security policy, a certification process to ensure that all new IT applications processing data are BCR-compliant, and job descriptions of the data protection officers or other persons in charge of data protection in the company.
Make your compliance visible
Well, if your company has done all of the above, it has to make that visible. Companies covered by the GDPR not only have to comply; they have to show that they comply. The GDPR places an obligation on controllers to demonstrate their compliance.
From the first contact with the controller, the website must convey compliance. The BCR, the privacy policies and the DPO's contact details must be visible, so that data subjects can reach the DPO in case of a data risk or breach. The EU representative's name and contact details must be published so that they are accessible to the supervisory authority in the EU. There should be a contact form for data subjects with options for access, the right to object, erasure, rectification and restriction, as well as an organisational chart of the company and a demonstration of data transfers in a data flow map. These are only some of the most important features to put in place.
Non-compliance is a very costly adventure, and one that businesses will try to avoid. With systematic planning, due analysis of what GDPR compliance requires and clearly defined processes, US companies can derive many benefits for their business and encourage data subjects in the EU to entrust their data to them freely. It is a thorough process, but one worth completing.
[1] Article 29 Working Party, Guidelines on Consent, p. 17.
[2] Judgment of the Court of Justice of the EU of 19 October 2016 in Case C-582/14 (Breyer).
[3] Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques, adopted on 10 April 2014, p. 3.
Central Banks Becoming Leaders in Blockchain Experimentation
Although central banks are among the most cautious institutions in the world, they are, perhaps surprisingly, among the first to implement and experiment with blockchain technology. Central banks have been quietly researching its possibilities since 2014. Over the past two years, a new wave has begun to emerge as more central banks launch large-scale pilots and research efforts, including rapid and complete cross-border interbank securities settlement.
The Blockchain and Distributed Ledger Technology team at the World Economic Forum interviewed dozens of central bank researchers and analysed more than 60 reports on past and current research efforts. The findings were released today in a white paper, Central Banks and Distributed Ledger Technology: How are Central Banks Exploring Blockchain Today?
“As the blockchain hype cools, we are starting to see the real use cases for blockchain technology take the spotlight,” said Ashley Lannquist, Blockchain Project Lead at the World Economic Forum. “Central bank activities with blockchain and distributed ledger technology are not always well known or communicated. As a result, there is much speculation and misunderstanding about objectives and the state of research. Dozens of central banks around the world are actively investigating whether blockchain can help solve long-standing challenges such as banking and payments system efficiency, payments security and resilience, as well as financial inclusion.”
It is not widely known, for instance, that the Bank of France has fully replaced its centralized process for the provisioning and sharing of SEPA Credit Identifiers (SCIs) with a decentralized, blockchain-based solution. SEPA, or Single Euro Payments Area, is a payment scheme created by the European Union and managed on a country-by-country basis for facilitating efficient and secure cross-border retail debit and card payments across European countries. The solution is a private deployment of the Ethereum blockchain network and has been in use since December 2017. It has enabled greater time efficiency, process auditability and disaster recovery.
The fact that dozens of central banks are exploring, and in some cases implementing, blockchain technology is significant, according to the white paper. It is an early indicator of the potential use of this emerging technology across financial and monetary systems. “Central banks play one of the most critical roles in the global economy, and their decisions about implementing distributed ledger and digital currency technologies in the future can have far-reaching implications for economies,” Lannquist said.
Top 10 central bank use cases
Following the interviews and analysis, central banks' experimentation with blockchain can be grouped into ten top use cases.
Retail central bank digital currency (CBDC) – A substitute or complement for cash and an alternative to traditional bank deposits. A central-bank-issued digital currency can be operated and settled in a peer-to-peer and decentralized manner, widely available for consumer use. Central banks from several countries are experimenting, including those from the Eastern Caribbean, Sweden, Uruguay, the Bahamas and Cambodia.
Wholesale central bank digital currency (CBDC) – This kind of digital currency would be available only to commercial banks and clearing houses for use in the wholesale interbank market. The central-bank-issued digital currency would be operated and settled in a peer-to-peer and decentralized manner. Central banks from several countries are experimenting, including those from South Africa, Canada, Japan, Thailand, Saudi Arabia, Singapore and Cambodia.
Interbank securities settlement – A focused application of blockchain technology, sometimes involving CBDC, enabling the rapid interbank clearing and settlement of securities for cash. This can achieve “delivery versus payment” interbank systems where two parties trading an asset, such as a security for cash, can conduct the payment for and delivery of the asset simultaneously. Central banks exploring this include the Bank of Japan, Monetary Authority of Singapore, Bank of England and Bank of Canada.
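The "delivery versus payment" property described above can be sketched as an atomic swap: both legs of the trade settle, or neither does. A minimal illustration with invented account structures (real DLT platforms enforce this with on-ledger atomic transactions or hash-locked transfers):

```python
def settle_dvp(cash, holdings, buyer, seller, price, security, quantity):
    """Atomically exchange cash for securities between two parties.

    Both preconditions are checked before either balance is touched,
    so the ledger is never left half-settled."""
    if cash.get(buyer, 0) < price:
        raise ValueError("buyer has insufficient cash")
    if holdings.get(seller, {}).get(security, 0) < quantity:
        raise ValueError("seller has insufficient securities")
    # Both legs are now guaranteed to succeed: apply them together.
    cash[buyer] -= price
    cash[seller] = cash.get(seller, 0) + price
    holdings[seller][security] -= quantity
    buyer_book = holdings.setdefault(buyer, {})
    buyer_book[security] = buyer_book.get(security, 0) + quantity
```

The check-then-apply structure is the whole point: a failed precondition aborts the trade with the ledger unchanged, which is the guarantee that separate cash and securities systems struggle to provide today.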
Payment system resiliency and contingency – The use of distributed ledger technology in a primary or back-up domestic interbank payment and settlement system to provide safety and continuity in case of threats, including technical or network failure, natural disaster, cybercrime and others. Often, this use case is coupled with others as part of the set of benefits that a distributed ledger technology implementation could potentially offer. Central banks exploring this include the Central Bank of Brazil and Eastern Caribbean Central Bank.
Bond issuance and lifecycle management – The use of distributed ledger technology in the bond auction, issuance or other life-cycle processes to reduce costs and increase efficiency. This may be applied to bonds issued and managed by sovereign states, international organizations or government agencies. Central banks or government regulators could be “observer nodes” to monitor activity where relevant. Early implementation is being conducted by the World Bank with their 2018 “bond-i” project.
Know-your-customer (KYC) and anti-money-laundering (AML) – Digital KYC/AML processes that leverage distributed ledger technology to track and share relevant customer payment and identity information to streamline processes. This may connect to a digital national identity platform or plug into pre-existing e-KYC or AML systems. Central banks exploring this include the Hong Kong Monetary Authority.
Information exchange and data sharing – The use of distributed or decentralized databases to create alternative systems for information and data sharing between or within related government or private sector institutions. Central banks exploring include the Central Bank of Brazil.
Trade finance – The employment of a decentralized database and functionality to enable faster, more efficient and more inclusive trade financing. Improves on today’s trade finance processes, which are often paper-based, labour-intensive and time-intensive. Customer information and transaction histories are shared between participants in the decentralized database while maintaining privacy and confidentiality where needed. Central banks exploring this include the Hong Kong Monetary Authority.
Cash money supply chain – The use of distributed ledger technology for issuing, tracking and managing the delivery and movement of cash from production facilities to the central bank and commercial bank branches; could include the ordering, depositing or movement of funds, and could simplify regulatory reporting. Central banks exploring this include the Eastern Caribbean Central Bank.
Customer SEPA Creditor Identifier (SCI) provisioning – Blockchain-based decentralized sharing repository for SEPA credit identifiers managed by the central bank and commercial banks in the SEPA debiting scheme. This is a faster, streamlined and decentralized system for identity provisioning and sharing. It can replace pre-existing manual and centralized processes that are time- and resource-intensive, as seen in the Bank of France’s Project MADRE implementation.
Emerging economies may benefit most: Cambodia, Thailand, South Africa and others experimenting
The National Bank of Cambodia will be one of the first central banks to deploy blockchain technology in its national payments system for use by consumers and commercial banks. It is implementing the technology in the second half of 2019 as an experiment to support financial inclusion and greater banking-system efficiency.
The Bank of Thailand and the South African Reserve Bank, among others, are experimenting with CBDC in large-scale pilots for interbank payment and settlement efficiency. The Eastern Caribbean Central Bank is exploring the suitability of distributed ledger technology (DLT) to advance multiple goals, from financial inclusion and payments efficiency to payment system resilience against storms and hurricanes.
“Over the next four years, we should expect to see many central banks decide whether they will use blockchain and distributed ledger technologies to improve their processes and economic welfare,” Lannquist said. “Given the systemic importance of central bank processes, and the relative freshness of blockchain technology, banks must carefully consider all known and unknown risks to implementation.”
How Nuclear Techniques Help Feed China
With 19% of the world’s population but only 7% of its arable land, China is in a bind: how to feed its growing and increasingly affluent population while protecting its natural resources. The country’s agricultural scientists have made growing use of nuclear and isotopic techniques in crop production over the last decades. In cooperation with the IAEA and the Food and Agriculture Organization of the United Nations (FAO), they are now helping experts from Asia and beyond in the development of new crop varieties, using irradiation.
While in many countries, nuclear research in agriculture is carried out by nuclear agencies that work independently from the country’s agriculture research establishment, in China the use of nuclear techniques in agriculture is integrated into the work of the Chinese Academy of Agricultural Sciences (CAAS) and provincial academies of agricultural sciences. This ensures that the findings are put to use immediately.
And indeed, the second most widely used wheat mutant variety in China, Luyuan 502, was developed by CAAS's Institute of Crop Sciences and the Shandong Academy of Agricultural Sciences using space-induced mutation breeding (see Space-induced mutation breeding). Its yield is 11% higher than that of the traditional variety, and it is also more tolerant to drought and major diseases, said Luxiang Liu, Deputy Director General of the Institute. It has been planted on over 3.6 million hectares – an area almost as large as Switzerland. It is one of 11 wheat varieties developed for improved salt and drought tolerance, grain quality and yield, Mr Liu said.
Through close cooperation with the IAEA and FAO, China has released over 1,000 mutant crop varieties in the past 60 years, and varieties developed in China account for a fourth of mutants listed currently in the IAEA/FAO’s database of mutant varieties produced worldwide, said Sobhana Sivasankar, Head of the Plant Breeding and Genetics Section at the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. The new mutation induction and high-throughput mutant selection approaches established at the Institute serve as a model to researchers from around the world, she added.
The Institute uses heavy ion beam accelerators, cosmic rays and gamma rays along with chemicals to induce mutations in a wide variety of crops, including wheat, rice, maize, soybean and vegetables. “Nuclear techniques are at the heart of our work, fully integrated into the development of plant varieties for the improvement of food security,” Liu said.
The Institute has also become a key contributor to the IAEA technical cooperation programme over the years: more than 150 plant breeders from over 30 countries have participated in training courses and benefited from fellowships at CAAS.
Indonesia’s nuclear agency, BATAN, and CAAS are looking for ways to collaborate on plant mutation breeding and Indonesian researchers are looking for ways to learn from China’s experience, said Totti Tjiptosumirat, Head of BATAN’s Center for Isotopes and Radiation Application. “Active dissemination and promotion of China’s activities in plant mutation breeding would benefit agricultural research across Asia,” he said.
From food safety to authenticity
Several of CAAS’ other institutes use nuclear-related and isotopic techniques in their research and development work and participate in several IAEA technical cooperation and coordinated research projects. The Institute of Quality Standards and Testing Technology for Agro-Products has developed a protocol to detect fake honey, using isotopic analysis. A large amount of what is sold in China as honey is estimated to be produced synthetically in labs rather than by bees in hives, so this has been an important tool in cracking down on fraudsters, said Professor Chen Gang, who leads the research work using isotopic techniques at the Institute. A programme is also in place to trace the geographical origin of beef using stable isotopes, he added.
The Institute also uses isotopic techniques to test the safety and verify the authenticity of milk and dairy products – work that grew out of IAEA coordinated research and technical cooperation projects that ran from 2013 to 2018. “After a few years of support, we are now fully self-sufficient,” Mr Chen said.
Improving nutrition efficiency
Various CAAS institutes use stable isotopes to study the absorption, transfer and metabolism of nutrients in animals. The results are used to optimize feed composition and feeding schedules. Isotope tracing offers higher sensitivity than conventional analytical methods, and this is particularly advantageous when studying the absorption of micronutrients, vitamins, hormones and drugs, said Dengpan Bu, Professor at the Institute of Animal Science.
While China has perfected the use of many nuclear techniques, in several areas it is looking to the IAEA and the FAO for support: the country's dairy industry is dogged by the low protein absorption rate of dairy cows. Less than half of the protein in animal feed is used by the ruminants; the rest ends up in their manure and urine. “This is wasteful for the farmer, and the high nitrogen content in the manure hurts the environment,” Mr Bu said. Using isotopes to trace nitrogen as it travels from feed through the animal's body would help improve nitrogen efficiency by enabling the necessary adjustments to the feed's composition. This will be particularly important as dairy consumption, currently at a third of the global average per person, continues to rise. “We are looking for international expertise, through the IAEA and the FAO, to help us tackle this problem.”
When neuroscience meets AI: What does the future of learning look like?
Meet Dr. Nandini Chatterjee Singh, a cognitive neuroscientist at UNESCO MGIEP (Mahatma Gandhi Institute of Education for Peace and Sustainable Development) where she has been leading the development of a new framework for socio-emotional learning. MGIEP focuses on mainstreaming socio-emotional learning in education systems and innovating digital pedagogies.
Dr. Singh answered five questions on the convergence of neuroscience and Artificial Intelligence in learning, ahead of the International Congress on Cognitive Science in Schools where she will be speaking this week.
What are the links between neuroscience and Artificial Intelligence when it comes to learning?
The focus of both neuroscience and AI is to understand how the brain works and thus predict behaviour. The better we understand the brain, the better the designs we can create for AI algorithms. When it comes to learning, the neuroscience–AI partnership can be synergistic. A good understanding of a particular learning process from neuroscience can inform the design of that process for AI. Similarly, if AI can find patterns in large data sets and derive a learning model from them, neuroscience can conduct experiments to confirm it.
Secondly, when neuroscience provides learning behaviours to AI, these behaviours can be translated into digital interactions, which in turn are used by AI to look at learning patterns across large numbers of children worldwide. The power of AI is that it can scale this to large numbers. AI can track and search through massive amounts of data to see how that learning happens, and when required, identify when learning is different or goes off track.
A third feature is individualized learning. We increasingly know that learning has a strong individual component, yet our classrooms are structured to provide common learning to all children. Sometimes these individual differences become crucial for bringing out the best in children, which is when we might tailor learning. Neuroscience research on individual differences has shown that studying an individual in detail can reveal a wealth of information about his or her learning patterns; however, this is extremely cost- and labour-intensive. This detailed learning from neuroscience can be handed to AI in order to scale: AI can collect extensive data at the personal level to design a learning path for that child. Thus, what neuroscience can study in small groups, AI can implement in large populations. If we are to ensure a world where every child achieves his or her full potential, such personalized learning holds great promise.
How do we create a structure around AI to ensure learning standards globally?
One thing AI capitalizes on and constantly relies on is large volumes of data. AI algorithms perform better if they are being fed by continuous distributed data. We need to keep in mind that humans are the ones designing these algorithms. This means that the algorithms will only do as well as the data that they have been trained on. Ensuring that we have access to large amounts of data that comes from various situations of learning is crucial. What sometimes becomes an issue for AI algorithms is that most of the training data has been selected from one particular kind of population. This means that the diversity in the forms of learning is missing from the system.
To return to reading and literacy as an example, in neuroscience, a large part of our research and understanding of how the brain learns to read has come from individuals learning to read English and alphabetic languages. However, globally, billions of people speak or read non-alphabetic languages and scripts that are visually complex, which are not really reflected in this research. Our understanding is built on one particular system that does not have enough diversity.
Therefore, it is important that AI algorithms be tested in varied environments around the world where there are differences in culture. This will create more robust learning models that are able to meet diverse learning requirements and cater to every kind of learner from across the world. If we are able to do that, then we can predict what the learning trajectory will look like for children anywhere.
Human beings share similarities in the way they learn, but pedagogies vary across situations, and those differences must be reflected in the data provided. The results would be far more pertinent if we could capture and reflect those differences in the data. This would help improve AI's learning and, ultimately, our understanding of how the brain works. We would then be better placed to leverage both the universal principles of learning used across the world and the effects that are cultural in nature – something we also want to hold on to and capitalize on in trying to help children. Designers of AI algorithms have so far not given this much attention, but they are now beginning to consider it in many places across the world.
How do you see AI’s role in inclusive education today, especially in the context of migration?
Societies have become multicultural in nature. If you go to a typical classroom in many countries, you will find children from diverse cultures sitting in the same learning space. Learning has to be able to meet a variety of needs and must become more inclusive and reflect cultural diversity. Innovative pedagogy such as games, interactive sessions and real-life situations are key because they test learning capabilities focused on skills that children should acquire. AI relies on digital interactions to understand learning and that comes from assessing skills and behaviours. We now recognize that what we need to empower our children with are skills and behaviours – not necessarily tons of information.
Digital pedagogies like interactive games are among those emerging rapidly to assess children’s skills. They are powerful because they can be used in multicultural environments and can assess different competencies. They are not necessarily tied to a specific language or curriculum but are rather performance-based. How do you assess children for collaboration in a classroom? In the context of migration and 21st-century skills, these are necessary abilities, and digital games provide a medium to assess them in education. When such interactive games are played by children across the world, they provide digital interactions to AI. AI might discover new patterns and ways of collaborating, since children often do things out of the box. A skills-based approach can be applied anywhere, whether in a classroom in India, France or Kenya. In contrast, curriculum-based methods are context-specific and show extensive cultural variation.
What are the risks and the challenges?
Data protection and security remain a huge issue and the biggest challenge in this sphere. We have to ensure that children are never at risk of exposure and that the data is not misused in any way. This is something that needs more global attention and backing.
Another crucial point is that learning assessments should not be restricted to just one domain. There are multiple ways, times and spaces in which to learn. Learning is continuous in nature and should adapt to the child’s needs at any particular point. Assessment should also be continuous in order to get a full picture of the improvement the child is demonstrating. If there is no improvement, we can provide interventions to help and find out why learning is not happening. From what we know from neuroscience, the earlier you provide an intervention, the better the child’s chances of changing and adapting: the brain learns and changes much more easily and quickly in childhood than in adulthood.
Yet, we want to be cautious about the conclusions we draw about how to intervene with children. Poor academic performance might have a social or emotional reason.
Thus, learning today needs to be multi-dimensional. Along with academic competencies, social and emotional skills also need to be assessed. Used wisely, this information can provide a great deal of insight into the child’s academic and emotional well-being, and the right intervention can be chosen on the basis of both. Unless multiple assessments all converge on the same result, the child’s learning abilities should not be labelled. AI offers a great opportunity to conduct multi-skill assessments rather than just one, and that is something we should leverage rather than abandon. The baseline standards for the algorithms must be properly considered for any type of assessment: they must come from a large quantity of widely distributed data in order to provide more accurate results. That is something we should not compromise on under any condition.
How is the teaching community responding to this new way of learning and assessing?
There are teachers who worry about the future of learning, but that is also because they do not necessarily have the full picture. People working on and promoting the use of AI in learning must play a crucial role in telling teachers that they will not be obsolete. Teachers will be more empowered and able to meet the needs of every kind of learner in their classrooms. The ideal would be one teacher per child, but that is of course impossible. AI is a tool to guide teachers in finding the right intervention for a student who might be struggling to learn. Because that intervention comes from data that has been checked for bias and diversity and does not use a ‘one size fits all’ approach, teachers can be more confident that it will fit the needs of the child. AI gives the teacher the opportunity to tailor learning to the child. In addition, we do not really know all the different kinds of learning; sometimes we have to be prepared to learn from children themselves. Children can give us insights into the different ways that learning actually happens, and teachers should be able to apply those insights back in the classroom. Teachers are extremely powerful individuals who shape the brains of so many children. If they do a good job, they shape individuals for life.