
Undercurrents: Episode 6 - Tribes of Europe, and the International Women's Rights Agenda at the UN





Planning for Africa's Future: Youth Perspectives from Kenya and South Africa





Undercurrents: Episode 7 - Libya's War Economy, and Is the United Nations Still Relevant?





Equality by 2030: The Press for Progress





Undercurrents: Episode 9 - Digital Subversion in Cyberspace, and Oleg Sentsov's Hunger Strike





Japan's Pivot in Asia





Undercurrents: Episode 12 - Trump's Visit to the UK, and Japanese Foreign Policy in Asia





Undercurrents: Episode 13 - India's Billionaires, and Sexual Exploitation in the UN





A View From the Élysée: France’s Role in the World





Red Flags: The Outlook for Xi Jinping's China





Undercurrents: Episode 18 - The American Dream vs America First, and Uganda's Illegal Ivory Trade





Undercurrents: Episode 19 - Green Building Projects in Jordan, and Qatar's Football World Cup





Undercurrents: Episode 20 - #MeToo and the Power of Women's Anger





Undercurrents: Episode 22 - China's Belt and Road Initiative, and the Rise of National Populism





Undercurrents: Episode 23 - Robin Niblett on the Future of Think-Tanks





A Divided Island: Sri Lanka's Constitutional Crisis





Une Nouvelle Révolution? Macron and the Gilets Jaunes





Undercurrents: Episode 26 - China's Economy, and UK Relations with Saudi Arabia





Iran's Revolution at 40





Undercurrents: Episode 30 - The Crisis in Kashmir, and How to Regulate Big Tech





Undercurrents: Episode 31 - Re-imagining the Global Food System





Ukraine's Unpredictable Presidential Elections





Undercurrents: Episode 32 - Protecting Health Workers in Conflict





Undercurrents: Episode 33 - Chinese Millennials, and Attacks on Infrastructure in Gaza





Undercurrents: Episode 34 - Protecting Children in Conflict





Undercurrents: Episode 35 - EU Elections, and Sustainable Development in Colombia





Undercurrents: Episode 36 - The Online World of British Muslims





Undercurrents: Episode 37 - Women in Leadership, and Europe's Ageing Population





Undercurrents: Summer Special - Andrés Rozental on Mexican Politics





Saudi Arabia's Foreign Policy Priorities





Understanding South Africa's Political Landscape





Rethinking 'The Economic Consequences of the Peace'





Undercurrents: Episode 43 - The UK Election, and Svyatoslav Vakarchuk on the Future of Ukraine





Angola's Business Promise: Evaluating the Progress of Privatization and Other Economic Reforms





Secularism, Nationalism and India's Constitution





Undercurrents: Episode 47 - Pakistan's Blasphemy Laws





The Climate Briefing: Episode 3 - Climate Change and National Security





Undercurrents: Episode 51 - Preparing for Pandemics, and Gandhi's Chatham House Speech





Undercurrents: Episode 52 - Defining Pandemics, and Mikheil Saakashvili's Ukrainian Comeback





Undercurrents: Episode 53 - Protecting Workers During COVID-19, and Food Insecurity in West Africa





Undercurrents: Episode 54 - India's COVID-19 Tracing App, and the Media's Pandemic Response





Undercurrents: Episode 55 - Benjamin Netanyahu's Trial, and the Identity Politics of Eurovision





Undercurrents: Episode 56 - Uganda's Children Born of War





Undercurrents: Episode 61 - LGBTQ+ Rights, and China's Post-COVID Global Standing





Undercurrents: Episode 63 - The Politics of Violent Images









The UK's new Online Safety Bill

10 February 2021, 3:00PM to 3:45PM | Online

Discussing the new proposals which include the establishment of a new ‘duty of care’ on companies to ensure they have robust systems in place to keep their users safe.

Governments, regulators and tech companies are currently grappling with how to promote an open and vibrant internet while also tackling harmful activity online, including the spread of hateful content and terrorist propaganda, cyberbullying, and child sexual exploitation and abuse.

The UK government’s Online Harms proposals include the establishment of a new ‘duty of care’ on companies to ensure they have robust systems in place to keep their users safe. Compliance with this new duty will be overseen by an independent regulator.

On 15 December 2020, DCMS and the Home Office published the full UK government response, setting out the intended policy positions for the regulatory framework, and confirming Ofcom as the regulator.

With the legislation likely to be introduced early this year, the panel will discuss questions including:

  • How to strike the balance between freedom of expression and protecting adults from harmful material?

  • How to ensure the legislation’s approach to harm is sufficiently future-proofed so new trends and harms are covered as they emerge?

  • What additional responsibilities will tech companies have under the new regulation?

  • Will the regulator have sufficient powers to tackle the wide range of harms in question?

This event is invite-only for participants, but you can watch the livestream of the discussion on this page at 15.00 GMT on Wednesday 10 February.





Facebook's power under scrutiny as Trump ban upheld

Expert comment | 6 May 2021

Keeping Donald Trump’s Facebook ban in place shows the vast power social media platforms hold, raising questions about whether that power is appropriately used.

Kate Jones

From a human rights perspective, the Oversight Board’s decision is a strong one, and not at all surprising. The board decided Facebook was right to suspend the former president’s access to post content on Facebook and Instagram, but not indefinitely.

It found Donald Trump’s posts violated Facebook’s community standards because they amounted to praise or support of people engaged in violence and that, applying a human rights assessment, Facebook’s suspension of Trump was a necessary and proportionate restriction of his right to freedom of expression.


However, the board also found that Trump’s indefinite suspension was neither in conformity with a clear Facebook procedure nor consistent with its commitment to respect human rights. The ruling requires Facebook to make a new decision on the future of Donald Trump’s account, grounded in its rules.

While opinions on this result will differ, the Oversight Board’s insistence on clear and accessible rules, and on respect for human rights in their implementation, is a welcome addition to Facebook’s operations.

But the Oversight Board’s powers are limited to content moderation – Facebook declined to answer the board’s questions about amplification of Trump’s posts through the platform’s design decisions and algorithms. This limitation on the board’s role should be lifted. It is in content amplification, not just content moderation, that Facebook should face scrutiny and accountability for the sake of the human rights of its users.

Fundamentally, human rights is not a veneer which can mask or legitimize underlying power dynamics or public policy – those still fall to be assessed for themselves.

The Trump/Facebook saga does highlight the vast power Facebook and other major social media platforms have over political discussion and persuasion. By granting or denying a voice to political figures, and by amplifying or quietening those voices, Facebook has the power to shape politics, electorates, and democratic processes. Improving content moderation through the Oversight Board, although important, does little to constrain that power.

Facebook itself, unlike a government, has no accountability to the general public, and the Oversight Board must not distract us from the need for a full conversation about the extent to which Facebook’s power is appropriately held and properly wielded.

Emily Taylor

This decision marks a coming of age for Facebook’s content moderation process. For years, decisions to take down content or ban users have been opaque, conducted by a human workforce that Facebook and other platforms have been hesitant to acknowledge. The platforms have also worried that being seen to exercise an editorial function might put at risk the legal protections that shield them from liability for user-generated content.

When the Oversight Board was first proposed, observers questioned whether a body funded by Facebook could properly exercise a legitimate appeals function. Now there is a reasoned ruling which partly supports the decision to de-platform a serving president, but also takes issue with the indefinite nature of the ban.


Facebook specifically asked the Oversight Board to consider the particular challenges that arise when the person involved is a political leader. The board concluded that Trump’s ‘status as head of state with a high position of trust not only imbued his words with greater force and credibility but also created risks that his followers would understand they could act with impunity’. The storming of the US Capitol, and the role President Trump played in stirring up the violence, underlined that political leaders’ words can motivate others to take harmful actions.

Just as the events of January 6 remain shocking, it remains shocking that private platforms have exercised the power to curb the speech of a US president. It is equally shocking that the platforms sat back and took no action over the previous four years, waiting until the final days of the transition.

The board’s decision is an evolution in private-sector content moderation, with a diverse board giving a reasoned opinion on a Facebook decision. But to comply fully with the principles of open justice, board decisions should include more detail on the individuals who made them. At present, it appears that all members of the board review each decision, but it is not clear which individuals were involved in drafting it, or that they were free from conflicts of interest. If the process is to gain respect as truly independent oversight of the platform’s decisions, greater transparency over the identity of decision-makers will be needed.

Mark Zuckerberg expressed concern about Facebook becoming an arbiter of truth or free speech and, overall, the difficulty of having private companies manage the application of fundamental rights on their platforms has not been solved. Just because companies have the financial resources to do it does not mean they necessarily should.

Yet no other international governance or arbitration system has emerged to handle the complexities of platform power over speech. In the context of that vacuum, the Oversight Board’s decision is a welcome step.





Undercurrents: The Oversight Board's Trump decision, and Merkel's legacy

Audio | 25 June 2021

Was Facebook right to suspend Trump? And how will Merkel be remembered?

In the wake of the storming of Capitol Hill on 6 January 2021, social media platforms took steps to remove former President Donald Trump from their websites for infringing community standards. This step was welcomed by many, but also raised serious questions about the power of social media companies to limit free speech and censor elected officials. The suspension of President Trump from Facebook was referred to the Oversight Board, an independent body of experts set up to scrutinise the platform’s content moderation decisions.  

In this episode, Ben speaks to Thomas Hughes and Kate Jones about the outcome of the Oversight Board’s inquiry into the Trump suspension, and the wider implications for content moderation on social media.  

Then Lara is joined by Hans Kundnani to assess the political outlook in Germany and reflect on the legacy of outgoing Chancellor Angela Merkel.  





The Arg-293 of Cryptochrome1 is responsible for the allosteric regulation of CLOCK-CRY1 binding in circadian rhythm [Computational Biology]

Mammalian circadian clocks are driven by transcription/translation feedback loops composed of positive transcriptional activators (BMAL1 and CLOCK) and negative repressors (CRYPTOCHROMEs (CRYs) and PERIODs (PERs)). CRYs, in complex with PERs, bind to the BMAL1/CLOCK complex and repress E-box–driven transcription of clock-associated genes. There are two individual CRYs, with CRY1 exhibiting higher affinity to the BMAL1/CLOCK complex than CRY2. It is known that this differential binding is regulated by a dynamic serine-rich loop adjacent to the secondary pocket of both CRYs, but the underlying features controlling loop dynamics are not known. Here we report that allosteric regulation of the serine-rich loop is mediated by Arg-293 of CRY1, identified as a rare CRY1 SNP in the Ensembl and 1000 Genomes databases. The p.Arg293His CRY1 variant caused a shortened circadian period in a Cry1−/−Cry2−/− double knockout mouse embryonic fibroblast cell line. Moreover, the variant displayed reduced repressor activity on BMAL1/CLOCK driven transcription, which is explained by reduced affinity to BMAL1/CLOCK in the absence of PER2 compared with CRY1. Molecular dynamics simulations revealed that the p.Arg293His CRY1 variant altered a communication pathway between Arg-293 and the serine loop by reducing its dynamicity. Collectively, this study provides direct evidence that allosterism in CRY1 is critical for the regulation of circadian rhythm.