I must preface that I am not a certified or self-trained expert in computer networking, the Internet, or Information-Technology (IT). The following views are mine and have been arrived at by listening to/reading up on the issue of net neutrality from partisan and non-partisan sources. Well-informed and fact-based views from experts on the subject are most welcome and highly sought.
The Trump administration placed net neutrality on the chopping block and Ajit Pai did the honors by repealing it. The issue created a large furor in the world of the Internet and social media, with divergent explanations floated by both sides.
Conservatives and right-wingers supported the repeal stating that the government shouldn’t impose itself on service providers and get to have a say in their operations. Folks on the left claimed that the Internet is no longer free and that loss of net-neutrality will usher in tiered tariffs and throttling/blocking of web content at the whim of the service providers (ISP).
It’s increasingly difficult to take a purely scientific approach towards technical issues in a culture where the pettiest things are used to smear the opposition and play partisan political games. With much effort, I have attempted to put aside politics and look merely into the nerdy details of this extremely abstruse concept of net neutrality.
The premise of net neutrality hinges on the claim that the Internet/Web (a nuanced, yet significant, distinction between the two will be discussed briefly later) is a public utility and, hence, should be made available and accessible to everyone equally, just like electricity, cooking gas, and water. Corporations are profit-driven and heartless; as a result, the government should get involved in the markets and make sure that everyone gets these utilities and nobody is left in the lurch.
So, is the Internet a public utility?
The science of economics describes two characteristics for a service to qualify as a public good (the concept underpinning public utilities): non-excludability (people cannot be denied the product regardless of whether they have paid) and non-rivalry (consumption by one doesn’t reduce availability for others).
The Internet certainly doesn’t meet the non-excludability criterion, in that people who don’t pay for the service don’t get to use it. Major cities across the US have set up public Wi-Fi in a bid to provide Internet to all, but such “access-for-all” isn’t standard across the vast majority of the nation.
The Internet does, however, meet the non-rivalry criterion. A huge influx of new users might overwhelm existing service capacity transiently, but additional hardware can be added to accommodate the growing demand. Thus, for all practical purposes, the Internet satisfies the non-rivalry criterion.
In summary, the Internet isn’t a public utility, at least not now.
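The two-criteria test above can be sketched in code. This is an illustrative toy of my own (the function and category names are not from any economics library), encoding the standard 2x2 taxonomy from public economics and where the article's argument places Internet access:

```python
# Illustrative sketch (names are my own): classifying goods by the two
# economic criteria discussed above. A good is a "public good" only when
# it is BOTH non-excludable and non-rival.

def classify(non_excludable: bool, non_rival: bool) -> str:
    """Standard 2x2 taxonomy from public economics."""
    if non_excludable and non_rival:
        return "public good"           # e.g. national defense, broadcast radio
    if non_excludable and not non_rival:
        return "common-pool resource"  # e.g. fisheries
    if not non_excludable and non_rival:
        return "club good"             # e.g. cable TV
    return "private good"              # e.g. food

# The article's argument in code: Internet access is excludable (you must
# pay an ISP) but, with capacity upgrades, effectively non-rival.
print(classify(non_excludable=False, non_rival=True))  # club good
```

Under this taxonomy, the Internet lands in the "club good" cell rather than the "public good" cell, which is the formal version of the conclusion above.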
But I would like to present a few additional points to make my case well-rounded and cogent.
The Internet was conceived in the 1960s as an effort on the part of the US federal government to transfer data over fault-tolerant communication networks run by computers. What started as a nascent and clunky project involving huge machines and laughable transfer speeds evolved into a means of global networking, telephony, and information transfer at incredibly fast speeds. This evolution was spearheaded largely by researchers at several government agencies from different parts of the world. In the 1990s, the Internet was opened up to private players for commercial use. Thus, the Internet was built and developed using taxpayer money. Also of note is that the Internet is a decentralized space that no one has hegemony over.
Now, over to the Web. Though the term is thrown around carelessly and interchangeably to describe the Internet, the Web is actually different from it. The Web is an application developed by Sir Tim Berners-Lee, during his time at CERN – a multi-government-funded organization – to access documents, pictures, videos and other files on the Internet that are marked up with hypertext. It’s one of several ways to access stuff on the Internet and communicate with one another. By corollary, the Web was thus crafted by an individual using the public’s (taxpayer) money. It’s this little, yet extremely important, corner of the Internet that this brouhaha is all about.
ISPs function as middlemen connecting end users to the Internet, mainly through the World Wide Web (the Web, or WWW). They neither created the Internet or the Web, nor do they maintain them.
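The Internet/Web layering described above can be made concrete with a short sketch. This is an illustrative example, not anything from the article's sources: the TCP socket is the "Internet" part (raw bytes between machines, shared by email, DNS and everything else), and the HTTP request written over it is the "Web" part:

```python
# A minimal sketch of the distinction drawn above: the Internet moves raw
# bytes between machines (here, a TCP socket), while the Web is just one
# protocol (HTTP) spoken over those bytes. Email (SMTP), DNS and others
# ride on the same underlying network.
import socket

def build_http_get(host: str, path: str = "/") -> bytes:
    """Compose an HTTP/1.1 request -- the 'Web' layer."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n\r\n").encode()

def fetch(host: str, port: int = 80) -> bytes:
    """Open a TCP connection -- the 'Internet' layer -- and speak HTTP over it."""
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(build_http_get(host))
        chunks = []
        while data := s.recv(4096):
            chunks.append(data)
    return b"".join(chunks)

# fetch("example.com") would return a raw HTTP response beginning with a
# status line such as "HTTP/1.1 200 OK".
```

Swapping the HTTP text for an SMTP conversation over the same kind of socket would be email instead of the Web, which is the whole point of the distinction.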
Effectively, private corporations are helping us access a digital space that was created using the public’s money. Moreover, the creators of this space – whether governmental agencies or individuals – in all their largesse decided to open up the space for commercial use and allow people to freely (not to be conflated with ‘for free’) use it.
Over the years, the Web has grown from an information archive and emailing medium to a source of employment, a means of starting and running a business, a tool to reach out to people across the world, a place to broadcast yourself and your work, and much more. While the Web doesn’t qualify as a public utility, it does serve as one of the few ways by which people in first-world countries can augment the socioeconomic momentum of the Industrial Revolution using digital technology and by which people in third-world countries can change their destinies by creating an app, or by engaging in commerce across borders, or educating themselves for free.
Repealing net neutrality gives ISPs a kind of hegemony, not over the Web or the Internet, but over what we consume from this public-utility-hopeful. While larger corporations can find a way around by paying the large sums of money ISPs might demand for a certain degree of visibility on their respective services, it is nearly impossible for an entrepreneur or a blogger or an independent journalist to pay the same sum for the same degree of visibility on those services.
“Take your business over to Facebook or on some other social media outlet and you won’t be discriminated against,” one might argue. Not quite true! Social media have tailored news feeds and show you what you have already seen. It will be difficult to market your business on fronts that are slowly devolving into echo chambers. Also, one cannot be certain that social media giants are unbiased in the way they deliver content, as has been the case with Facebook, which was accused of manipulating the ‘trending’ feature to suit their political leaning.
The gravity of the problem is further compounded when one factors in the regional monopolies that ISPs enjoy in the US. Competition is scarce because of the cost-intensive nature of running cables under the streets and setting up hardware. Overbuilders (ISPs using existing hardware and cables to provide an alternative) can increase competition, but financial feasibility and ROI of such ventures are pretty dim. In this regard, the Web certainly functions like a public utility and requires some sort of accountability on part of the ISP.
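The tiered-visibility scenario described above can be illustrated with a toy model. Everything here is hypothetical (the destination names and rate caps are invented for illustration, not drawn from any real ISP's practice): an ISP assigns each destination a bandwidth cap, so a site that pays for priority loads far faster than an independent one on the default lane:

```python
# Hypothetical sketch of post-repeal traffic shaping: an ISP caps each
# destination's rate, so a deep-pocketed site can buy a higher cap than
# an independent blogger. All names and numbers here are invented.

RATE_CAPS_MBPS = {
    "bigstreamer.example": 100.0,   # paid for a priority lane
    "indieblog.example":     5.0,   # default lane
}
DEFAULT_CAP_MBPS = 5.0

def transfer_time_s(dest: str, megabits: float) -> float:
    """Seconds to deliver a payload under the destination's rate cap."""
    cap = RATE_CAPS_MBPS.get(dest, DEFAULT_CAP_MBPS)
    return megabits / cap

# The same 50-megabit page: half a second from the paying site,
# ten seconds from the blog -- identical content, unequal visibility.
print(transfer_time_s("bigstreamer.example", 50))  # 0.5
print(transfer_time_s("indieblog.example", 50))    # 10.0
```

With regional monopolies, the user has no competing ISP to switch to when the default lane is throttled, which is the compounding problem the paragraph above describes.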
There is also a technical angle to the importance of net neutrality, which is lucidly explained here.
Repeal of net neutrality should disconcert everyone, especially small business owners, entrepreneurs, innovators, and the most vulnerable – alternative news media outlets, especially the ones with unsavory views – many of which tend to be on the political right. Cheering along to your own demise because your guy did it is the gold standard of intellectual indolence and buffoonery.
I would like to state once again that I am not a certified or self-trained expert in matters of the Internet, computing, or networking, and would welcome fact-based feedback on this subject.
Having said that, I can tell you two things with certainty: 1. Capitalize the first letter of Internet and Web and place the definite article the before these words when referencing them; and 2. The Web runs on the Internet; it’s one of the ways we use the Internet to do stuff.
Central Banks Becoming Leaders in Blockchain Experimentation
Although central banks are among the most cautious institutions in the world, they are, perhaps surprisingly, among the first to implement and experiment with blockchain technology. Central banks have been quietly researching its possibilities since 2014. Over the past two years, the beginning of a new wave has emerged as more central banks launch large-scale pilots and research efforts, including pilots for cross-border interbank payments and securities settlement.
The Blockchain and Distributed Ledger Technology team at the World Economic Forum interviewed dozens of central bank researchers and analysed more than 60 reports on past and current research efforts. The findings were released today in a white paper, Central Banks and Distributed Ledger Technology: How are Central Banks Exploring Blockchain Today?
“As the blockchain hype cools, we are starting to see the real use cases for blockchain technology take the spotlight,” said Ashley Lannquist, Blockchain Project Lead at the World Economic Forum. “Central bank activities with blockchain and distributed ledger technology are not always well known or communicated. As a result, there is much speculation and misunderstanding about objectives and the state of research. Dozens of central banks around the world are actively investigating whether blockchain can help solve long-standing challenges such as banking and payments system efficiency, payments security and resilience, as well as financial inclusion.”
It is not widely known, for instance, that the Bank of France has fully replaced its centralized process for the provisioning and sharing of SEPA Credit Identifiers (SCIs) with a decentralized, blockchain-based solution. SEPA, or Single Euro Payments Area, is a payment scheme created by the European Union and managed on a country-by-country basis for facilitating efficient and secure cross-border retail debit and card payments across European countries. The solution is a private deployment of the Ethereum blockchain network and has been in use since December 2017. It has enabled greater time efficiency, process auditability and disaster recovery.
The fact that dozens of central banks are exploring, and in some cases implementing, blockchain technology is significant, according to the white paper. It is an early indicator of the potential use of this emerging technology across financial and monetary systems. “Central banks play one of the most critical roles in the global economy, and their decisions about implementing distributed ledger and digital currency technologies in the future can have far-reaching implications for economies,” Lannquist said.
Top 10 central bank use cases
Following the interviews and analysis, the white paper highlights the top 10 ways central banks are experimenting with blockchain.
Retail central bank digital currency (CBDC) – A substitute or complement for cash and an alternative to traditional bank deposits. A central-bank-issued digital currency can be operated and settled in a peer-to-peer and decentralized manner, widely available for consumer use. Central banks from several countries are experimenting, including those from the Eastern Caribbean, Sweden, Uruguay, the Bahamas and Cambodia.
Wholesale central bank digital currency (CBDC) – This kind of digital currency would only be available for commercial banks and clearing houses to use in the wholesale interbank market. Central-bank-issued digital currency would be operated and settled in a peer-to-peer and decentralized manner. Central banks from several countries are experimenting, including those from South Africa, Canada, Japan, Thailand, Saudi Arabia, Singapore and Cambodia.
Interbank securities settlement – A focused application of blockchain technology, sometimes involving CBDC, enabling the rapid interbank clearing and settlement of securities for cash. This can achieve “delivery versus payment” interbank systems where two parties trading an asset, such as a security for cash, can conduct the payment for and delivery of the asset simultaneously. Central banks exploring this include the Bank of Japan, Monetary Authority of Singapore, Bank of England and Bank of Canada.
Payment system resiliency and contingency – The use of distributed ledger technology in a primary or back-up domestic interbank payment and settlement system to provide safety and continuity in case of threats, including technical or network failure, natural disaster, cybercrime and others. Often, this use case is coupled with others as part of the set of benefits that a distributed ledger technology implementation could potentially offer. Central banks exploring this include the Central Bank of Brazil and Eastern Caribbean Central Bank.
Bond issuance and lifecycle management – The use of distributed ledger technology in the bond auction, issuance or other life-cycle processes to reduce costs and increase efficiency. This may be applied to bonds issued and managed by sovereign states, international organizations or government agencies. Central banks or government regulators could be “observer nodes” to monitor activity where relevant. Early implementation is being conducted by the World Bank with their 2018 “bond-i” project.
Know-your-customer (KYC) and anti-money-laundering (AML) – Digital KYC/AML processes that leverage distributed ledger technology to track and share relevant customer payment and identity information to streamline processes. This may connect to a digital national identity platform or plug into pre-existing e-KYC or AML systems. Central banks exploring this include the Hong Kong Monetary Authority.
Information exchange and data sharing – The use of distributed or decentralized databases to create alternative systems for information and data sharing between or within related government or private sector institutions. Central banks exploring include the Central Bank of Brazil.
Trade finance – The employment of a decentralized database and functionality to enable faster, more efficient and more inclusive trade financing. Improves on today’s trade finance processes, which are often paper-based, labour-intensive and time-intensive. Customer information and transaction histories are shared between participants in the decentralized database while maintaining privacy and confidentiality where needed. Central banks exploring this include the Hong Kong Monetary Authority.
Cash money supply chain – The use of distributed ledger technology for issuing, tracking and managing the delivery and movement of cash from production facilities to the central bank and commercial bank branches; could include the ordering, depositing or movement of funds, and could simplify regulatory reporting. Central banks exploring this include the Eastern Caribbean Central Bank.
Customer SEPA Creditor Identifier (SCI) provisioning – Blockchain-based decentralized sharing repository for SEPA credit identifiers managed by the central bank and commercial banks in the SEPA debiting scheme. This is a faster, streamlined and decentralized system for identity provisioning and sharing. It can replace pre-existing manual and centralized processes that are time- and resource-intensive, as seen in the Bank of France’s Project MADRE implementation.
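The "process auditability" cited for use cases like the Bank of France's SCI registry comes from the append-only, hash-chained structure that all of the ledgers above share. The following is a toy sketch of that underlying idea only; it is emphatically not the Project MADRE code or a real Ethereum deployment, and the SCI values are invented:

```python
# A toy sketch of why a shared, append-only ledger gives auditability:
# each record's hash covers the previous record's hash, so tampering
# with history is detectable by every participant. Not production code.
import hashlib
import json

def entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous record's hash."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger: list, entry: dict) -> None:
    """Add a record whose hash chains to the record before it."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    ledger.append({"entry": entry, "hash": entry_hash(entry, prev)})

def verify(ledger: list) -> bool:
    """Recompute the chain; any tampering breaks every later hash."""
    prev = "0" * 64
    for rec in ledger:
        if rec["hash"] != entry_hash(rec["entry"], prev):
            return False
        prev = rec["hash"]
    return True

ledger = []
append(ledger, {"sci": "FR00ZZZ000001", "bank": "A"})  # hypothetical IDs
append(ledger, {"sci": "FR00ZZZ000002", "bank": "B"})
assert verify(ledger)
ledger[0]["entry"]["bank"] = "C"    # tamper with history...
assert not verify(ledger)           # ...and verification fails
```

Real deployments add consensus and access control on top, but this tamper-evidence is what replaces the trust placed in a single central operator.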
Emerging economies may benefit most: Cambodia, Thailand, South Africa and others experimenting
The National Bank of Cambodia will be one of the first central banks to deploy blockchain technology in its national payments system for use by consumers and commercial banks. It plans to implement blockchain technology in the second half of 2019 as an experiment to support financial inclusion and greater banking-system efficiency.
The Bank of Thailand and the South African Reserve Bank, among others, are experimenting with CBDC in large-scale pilots for interbank payment and settlement efficiency. The Eastern Caribbean Central Bank is exploring the suitability of distributed ledger technology (DLT) to advance multiple goals, from financial inclusion and payments efficiency to payment system resilience against storms and hurricanes.
“Over the next four years, we should expect to see many central banks decide whether they will use blockchain and distributed ledger technologies to improve their processes and economic welfare,” Lannquist said. “Given the systemic importance of central bank processes, and the relative novelty of blockchain technology, banks must carefully consider all known and unknown risks to implementation.”
How Nuclear Techniques Help Feed China
With 19% of the world’s population but only 7% of its arable land, China is in a bind: how to feed its growing and increasingly affluent population while protecting its natural resources. The country’s agricultural scientists have made growing use of nuclear and isotopic techniques in crop production over the last decades. In cooperation with the IAEA and the Food and Agriculture Organization of the United Nations (FAO), they are now helping experts from Asia and beyond in the development of new crop varieties, using irradiation.
While in many countries, nuclear research in agriculture is carried out by nuclear agencies that work independently from the country’s agriculture research establishment, in China the use of nuclear techniques in agriculture is integrated into the work of the Chinese Academy of Agricultural Sciences (CAAS) and provincial academies of agricultural sciences. This ensures that the findings are put to use immediately.
And indeed, the second most widely used wheat mutant variety in China, Luyuan 502, was developed by CAAS’s Institute of Crop Sciences and the Institute of Shandong Academy of Agricultural Sciences, using space-induced mutation breeding (see Space-induced mutation breeding). It has a yield that is 11% higher than the traditional variety and is also more tolerant of drought and major diseases, said Luxiang Liu, Deputy Director General of the Institute. It has been planted on over 3.6 million hectares – an area almost as large as Switzerland. It is one of 11 wheat varieties developed for improved salt and drought tolerance, grain quality and yield, Mr Liu said.
Through close cooperation with the IAEA and FAO, China has released over 1,000 mutant crop varieties in the past 60 years, and varieties developed in China account for a fourth of mutants listed currently in the IAEA/FAO’s database of mutant varieties produced worldwide, said Sobhana Sivasankar, Head of the Plant Breeding and Genetics Section at the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. The new mutation induction and high-throughput mutant selection approaches established at the Institute serve as a model to researchers from around the world, she added.
The Institute uses heavy ion beam accelerators, cosmic rays and gamma rays along with chemicals to induce mutations in a wide variety of crops, including wheat, rice, maize, soybean and vegetables. “Nuclear techniques are at the heart of our work, fully integrated into the development of plant varieties for the improvement of food security,” Liu said.
The Institute has also become a key contributor to the IAEA technical cooperation programme over the years: more than 150 plant breeders from over 30 countries have participated in training courses and benefited from fellowships at CAAS.
Indonesia’s nuclear agency, BATAN, and CAAS are looking for ways to collaborate on plant mutation breeding and Indonesian researchers are looking for ways to learn from China’s experience, said Totti Tjiptosumirat, Head of BATAN’s Center for Isotopes and Radiation Application. “Active dissemination and promotion of China’s activities in plant mutation breeding would benefit agricultural research across Asia,” he said.
From food safety to authenticity
Several of CAAS’ other institutes use nuclear-related and isotopic techniques in their research and development work and participate in several IAEA technical cooperation and coordinated research projects. The Institute of Quality Standards and Testing Technology for Agro-Products has developed a protocol to detect fake honey, using isotopic analysis. A large amount of what is sold in China as honey is estimated to be produced synthetically in labs rather than by bees in hives, so this has been an important tool in cracking down on fraudsters, said Professor Chen Gang, who leads the research work using isotopic techniques at the Institute. A programme is also in place to trace the geographical origin of beef using stable isotopes, he added.
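The isotopic idea behind the honey test can be sketched numerically. This is a hedged illustration of the general internal-standard approach, not the Institute's actual protocol: honey made from nectar (C3 plants) has a δ13C value near −25‰, while corn or cane syrup (C4 plants) sits near −10‰, so comparing the bulk honey against its own extracted protein exposes added C4 sugar. The specific numbers below are illustrative:

```python
# Hedged sketch of carbon-isotope honey authentication: an estimate of
# added C4 sugar from the gap between the honey's delta-13C and that of
# its own protein (the internal standard). Values are illustrative only.

D13C_C4_SUGAR = -9.7  # commonly cited mean delta-13C for C4 sugars (permil)

def apparent_c4_sugar_pct(d13c_protein: float, d13c_honey: float) -> float:
    """Estimate % added C4 sugar from the protein-honey isotope gap."""
    return (d13c_protein - d13c_honey) / (d13c_protein - D13C_C4_SUGAR) * 100

# Pure honey: protein and bulk honey match -> ~0% added sugar.
print(round(apparent_c4_sugar_pct(-25.0, -25.0), 1))   # 0.0
# Adulterated sample: bulk honey shifted toward C4 values -> flagged.
print(round(apparent_c4_sugar_pct(-25.0, -20.0), 1))   # 32.7
```

Because the protein comes from the bees themselves, a fraudster cannot shift it along with the sugars, which is what makes the internal-standard comparison hard to defeat.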
The Institute uses isotopic techniques to test the safety and to verify the authenticity of milk and dairy products – work that was the outcome of IAEA technical coordinated research and cooperation projects that lasted from 2013 to 2018. “After a few years of support, we are now fully self-sufficient,” Mr Chen said.
Improving nutrition efficiency
Various CAAS institutes use stable isotopes to study the absorption, transfer and metabolism of nutrients in animals. The results are used to optimize feed composition and feeding schedules. Isotope tracing offers higher sensitivity than conventional analytical methods, and this is particularly advantageous when studying the absorption of micronutrients, vitamins, hormones and drugs, said Dengpan Bu, Professor at the Institute of Animal Science.
While China has perfected the use of many nuclear techniques, in several areas it is looking to the IAEA and the FAO for support: the country’s dairy industry is dogged by the low protein absorption rate of dairy cows. Less than half of the protein in animal feed is used by the ruminants; the rest ends up in their manure and urine. “This is wasteful for the farmer and the high nitrogen content in the manure hurts the environment,” Mr Bu said. The use of isotopes to trace nitrogen as it travels from feed through the animal’s body would help improve nitrogen efficiency by making the necessary adjustments to the composition of the feed. This will be particularly important as dairy consumption, currently at a third of the global average per person, continues to rise. “We are looking for international expertise, through the IAEA and the FAO, to help us tackle this problem.”
When neuroscience meets AI: What does the future of learning look like?
Meet Dr. Nandini Chatterjee Singh, a cognitive neuroscientist at UNESCO MGIEP (Mahatma Gandhi Institute of Education for Peace and Sustainable Development) where she has been leading the development of a new framework for socio-emotional learning. MGIEP focuses on mainstreaming socio-emotional learning in education systems and innovating digital pedagogies.
Dr. Singh answered five questions on the convergence of neuroscience and Artificial Intelligence in learning, ahead of the International Congress on Cognitive Science in Schools where she will be speaking this week.
What are the links between neuroscience and Artificial Intelligence when it comes to learning?
The focus of both neuroscience and AI is to understand how the brain works and thus predict behaviour. And the better we understand the brain, the better designs we can create for AI algorithms. When it comes to learning, the neuroscience – AI partnership can be synergistic. A good understanding of a particular learning process by neuroscience can be used to inform the design of that process for AI. Similarly, if AI can find patterns from large data sets and get a learning model, neuroscience can conduct experiments to confirm it.
Secondly, when neuroscience provides learning behaviours to AI, these behaviours can be translated into digital interactions, which in turn are used by AI to look at learning patterns across large numbers of children worldwide. The power of AI is that it can scale this to large numbers. AI can track and search through massive amounts of data to see how that learning happens, and when required, identify when learning is different or goes off track.
A third feature is individualized learning. We increasingly know that learning has a strong individual component, yet our classrooms are structured to provide common learning to all children. Sometimes these individual differences become crucial to bringing out the best in children, which is when we might tailor learning. Neuroscience research on individual differences has shown that detailed information on an individual can reveal a wealth of information about their learning patterns. However, gathering it is extremely cost- and labour-intensive. This detailed learning from neuroscience can be provided to AI in order to scale: AI can collect extensive, detailed data at the personal level to design a path to learning for each child. Thus, what neuroscience can study in small groups, AI can implement in large populations. If we are to ensure a world where every child achieves their full potential, such personalized learning holds great promise.
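The individualized-learning loop Dr. Singh describes can be sketched in miniature. This is my own toy construction, not MGIEP's system or any published algorithm: a learner's response history nudges item difficulty up or down, the kind of per-child adaptation that AI could run at population scale where neuroscience studies small groups:

```python
# Toy sketch (my construction, not an MGIEP system) of adaptive,
# individualized learning: item difficulty tracks each learner's
# recent performance on a 0..1 scale.

def next_difficulty(current: float, correct: bool,
                    step: float = 0.1) -> float:
    """Nudge difficulty up after success, down after failure (clamped 0..1)."""
    new = current + step if correct else current - step
    return min(1.0, max(0.0, new))

difficulty = 0.5
for outcome in [True, True, False, True]:   # one learner's responses
    difficulty = next_difficulty(difficulty, outcome)
print(round(difficulty, 1))  # 0.7
```

Real adaptive systems use far richer learner models, but even this caricature shows why the data is inherently per-child, which is what raises the scaling and privacy questions discussed later in the interview.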
How do we create a structure around AI to ensure learning standards globally?
One thing AI capitalizes on and constantly relies on is large volumes of data. AI algorithms perform better if they are being fed by continuous distributed data. We need to keep in mind that humans are the ones designing these algorithms. This means that the algorithms will only do as well as the data that they have been trained on. Ensuring that we have access to large amounts of data that comes from various situations of learning is crucial. What sometimes becomes an issue for AI algorithms is that most of the training data has been selected from one particular kind of population. This means that the diversity in the forms of learning is missing from the system.
To return to reading and literacy as an example, in neuroscience, a large part of our research and understanding of how the brain learns to read has come from individuals learning to read English and alphabetic languages. However, globally, billions of people speak or read non-alphabetic languages and scripts that are visually complex, which are not really reflected in this research. Our understanding is built on one particular system that does not have enough diversity.
Therefore, it is important that AI algorithms be tested in varied environments around the world where there are differences in culture. This will create more robust learning models that are able to meet diverse learning requirements and cater to every kind of learner from across the world. If we are able to do that, then we can predict what the learning trajectory will look like for children anywhere.
Human beings have similarities in the way they learn, but pedagogies vary across different situations. In addition, those differences must be reflected in the data provided. The results would be much more pertinent if we are able to capture and reflect those differences in the data. This will help us improve the learning of AI, and ultimately understand how the brain works. We would then be better suited to leverage the universal principles of learning that are being used across the world and effects that are cultural in nature. That is also something that we want to hold on to and capitalize on in trying to help children. People designing AI algorithms so far have not given a lot of attention to this, but they are now beginning to consider it in many places across the world.
How do you see AI’s role in inclusive education today, especially in the context of migration?
Societies have become multicultural in nature. If you go to a typical classroom in many countries, you will find children from diverse cultures sitting in the same learning space. Learning has to be able to meet a variety of needs and must become more inclusive and reflect cultural diversity. Innovative pedagogies such as games, interactive sessions and real-life situations are key because they test learning capabilities focused on skills that children should acquire. AI relies on digital interactions to understand learning and that comes from assessing skills and behaviours. We now recognize that what we need to empower our children with are skills and behaviours – not necessarily tons of information.
Digital pedagogies like interactive games are among the ones emerging rapidly to assess children’s skills. They are powerful because they can be used in multicultural environments and can assess different competencies. They are not necessarily tied to a specific language or curriculum but are rather performance-based. How do you assess children for collaboration in a classroom? In the context of migration and 21st century skills, these are necessary abilities and digital games provide a medium to assess these in education. When such interactive games are played by children across the world, they provide digital interactions to AI. AI might discover new patterns and ways to collaborate since children have ways of doing things that are often out of the box. A skills-based approach can be applied anywhere, whether it is in a classroom in India, France or Kenya. In contrast, curriculum-based methods are context-specific and show extensive cultural variation.
What are the risks and the challenges?
Data protection and security is of course still a huge issue and is the biggest challenge in this sphere. We have to ensure that children are never at risk of exposure and that the data is not misused in any way. This is something that needs more global attention and backing.
Another crucial point is that learning assessments should not be restricted to just one domain. There are multiple ways, times and spaces to learn. Learning is continuous in nature and should be adaptable to the child’s needs at that particular point. Assessment should also be continuous in order to get a full picture of the improvement the child is demonstrating. If there is no improvement, then we can provide interventions to help and find out why learning is not happening. From what we know from neuroscience, the earlier you can provide an intervention, the better the child’s chances of being able to change and adapt. The brain’s ability to learn and change is much greater in childhood than in adulthood.
Yet, we want to be cautious about the conclusions we draw about how to intervene with children. Poor academic performance might have a social or emotional reason.
Thus, learning today needs to be multi-dimensional. Along with academic competencies, social and emotional skills also need to be assessed. If this information is used wisely, it can provide a lot of insight about the child’s academic and emotional well-being. Based on the combination of the two, the right intervention can be provided. Unless multiple assessments all converge on the same result, the child’s learning abilities should not be labeled. AI gives a great opportunity to conduct multi-skills assessments, rather than just one. And that is something that we should leverage, rather than abandon. The standards for the baselines for the algorithms must be properly taken into consideration for any type of assessment. They must come from a large quantity of distributed data in order to provide more accurate results. That is something that we should not compromise under any condition.
How is the teaching community responding to this new way of learning and assessing?
There are teachers who worry about the future of learning, but that is also because they do not necessarily have the full picture. People working on and promoting the use of AI in learning must play a crucial role in telling teachers that they will not be obsolete. Teachers will be more empowered and able to meet the needs of every kind of learner in their classrooms. The ideal world would be to have one teacher per child, but that is of course impossible. AI is a tool to guide teachers when it comes to finding the right intervention for a student who might be struggling to learn. That intervention comes from data that has been checked for bias and diversity and does not use a ‘one size fits all’ approach, so teachers can be more certain that it will fit the needs of the child. AI gives the teacher the opportunity to tailor learning for the child. In addition, we do not really know all the different kinds of learning. Sometimes we have to be prepared to learn from children themselves. Children can give us insights into the different ways that learning actually happens, and teachers should be able to apply them back in the classroom. Teachers are extremely powerful individuals who are able to shape the brains of so many children. If they are doing a good job, they are shaping individuals for life.