Covid-19 accelerated tech-driven solutions, since technology was often the only safe way to interact during the pandemic, and it continues to play a vital role. However, some of the methods that emerged pose real dangers to ordinary people. Contact tracing apps, for example, were developed to keep track of Covid-19 patients and their contacts, yet even well-intentioned solutions have their downsides. The following are dangers posed by some tech-driven solutions to Covid-19.
Digital Contact Tracing
Digital contact tracing was developed to identify people who had been in contact with Covid-19 patients. When contact tracing is done incorrectly, it threatens privacy, and the data it collects can spill over into other domains, such as law enforcement.
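At a high level, privacy-preserving designs (such as the Google/Apple Exposure Notification system) exchange short-lived random tokens rather than identities, so exposure matching can happen on the device itself. The sketch below is a simplified toy illustration of that idea, not the real protocol; the `Phone` class and `exposure_check` function are invented for this example.

```python
import secrets

class Phone:
    """Toy model of a contact-tracing app: broadcasts rotating
    random tokens and remembers the tokens it has heard nearby."""
    def __init__(self):
        self.own_tokens = []   # tokens this phone has broadcast
        self.heard = set()     # tokens received from nearby phones

    def broadcast(self):
        # A fresh random token is generated each interval, so no
        # stable identifier is ever transmitted over the air.
        token = secrets.token_hex(8)
        self.own_tokens.append(token)
        return token

    def hear(self, token):
        self.heard.add(token)

def exposure_check(phone, infected_tokens):
    # Matching happens locally: the server only publishes the tokens
    # of confirmed cases, never a contact graph of who met whom.
    return bool(phone.heard & set(infected_tokens))

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())   # Alice and Bob are near each other
# Later, Alice tests positive and her tokens are published.
print(exposure_check(bob, alice.own_tokens))   # True: Bob was exposed
```

The privacy risks the article describes arise when this local-matching design is abandoned, for instance when apps upload identities or locations to a central server instead of anonymous tokens.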
Many people worry that the tight coordination between governments and tech companies that digital contact tracing requires can erode privacy and human rights. To serve their own goals, government actors may (and do) misrepresent and corrupt public-health messages.
Automated policing and content restriction also raise the possibility of a slide into authoritarianism. Governments and other authorities repurposed automated tools originally designed for contact tracing to enforce quarantine measures and stay-at-home compliance.
In practice, information acquired by these digital contact-tracing apps was sometimes shared with third parties without users' consent, leaking personal data to unknown parties pursuing their own ambitions rather than helping combat the spread of Covid-19.
AI
AI systems use techniques, including cognitive computing, to generate statistical judgments about people, covering demographic data, preferences, and potential future actions.
To address the unique needs of different populations, automated systems rely on correlations drawn from massive datasets that are assumed to accurately reflect all identities. However, societal and cultural prejudices often warp the data they employ. For some groups data may be unavailable; for others it may be of poor quality, or it may encode existing societal inequities.
As a result, algorithms tend to produce inaccurate projections while also reinforcing social prejudices and biases. Regrettably, much of the information gathered and recorded about COVID-19 is insufficient and skewed. COVID-19 infection rates, for example, have been underestimated by a factor of fifty or more.
In many cases, medical information represents only a narrow slice of the population: affluent, white patients with easy access to a limited supply of tests and expensive medical procedures. This made combating the pandemic far harder.
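The effect of skewed sampling can be shown with a toy calculation (the numbers below are invented for illustration, not real Covid-19 data): if testing reaches mostly one subgroup, the estimated infection rate reflects that subgroup rather than the whole population.

```python
# Hypothetical population: two groups with different true infection
# rates, but testing access concentrated in group A.
population = {
    "group_a": {"size": 1000, "true_rate": 0.02, "tested": 900},
    "group_b": {"size": 9000, "true_rate": 0.10, "tested": 100},
}

def naive_estimate(pop):
    # Infection rate computed only from people who were tested.
    positives = sum(g["tested"] * g["true_rate"] for g in pop.values())
    tested = sum(g["tested"] for g in pop.values())
    return positives / tested

def true_rate(pop):
    # Actual population-wide infection rate.
    positives = sum(g["size"] * g["true_rate"] for g in pop.values())
    total = sum(g["size"] for g in pop.values())
    return positives / total

print(round(naive_estimate(population), 3))  # 0.028, skewed toward group A
print(round(true_rate(population), 3))       # 0.092, the real rate
```

Here the tested sample underestimates the true rate by more than a factor of three; with real-world disparities in test access, the gap can be far larger, which is the undercounting problem described above.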