
The intelligent processing of IoT data with machine learning and artificial intelligence, such as deep neural networks, makes it difficult to explain why predictions turn out the way they do. Accounting for these processes cannot be accomplished solely by having access to the algorithms (Ananny and Crawford 2016); it also requires continuously storing data processing outcomes and making that information available for independent audits, for example (Kroll et al. 2016; Bechmann and Bowker 2017). One of the most significant challenges in such audits is the issue of time: by the time an audit is completed, the algorithm may have changed or picked up new patterns in the closed machine learning cycle, making its predictions different and perhaps violating some of the values that were evaluated for.
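The audit challenge above can be made concrete with a minimal sketch of the kind of record-keeping the cited authors argue for: each prediction is logged together with the exact model version that produced it, so a later audit can detect whether the model has since changed. The function and field names (`audit_record`, `model_version`, `input_hash`) are illustrative assumptions, not an established API.

```python
import hashlib
import json
from datetime import datetime, timezone

def audit_record(model_version: str, features: dict, prediction) -> dict:
    """Build an append-only audit entry tying a prediction to the
    model version that produced it, so independent auditors can tell
    whether the deployed model has changed since the prediction."""
    payload = json.dumps(features, sort_keys=True).encode()
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input_hash": hashlib.sha256(payload).hexdigest(),
        "prediction": prediction,
    }

# Hypothetical IoT example: an occupancy model classifying a room.
log = []
log.append(audit_record("model-v1", {"temp": 21.5, "motion": 1}, "occupied"))
print(log[0]["model_version"])  # model-v1
```

Hashing the input rather than storing it verbatim is one way such a log could avoid retaining raw sensor data while still letting auditors verify that identical inputs were processed consistently across model versions.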

Transparency


Accountability, to some extent, combines the goals of transparency and governance: rather than claiming direct transparency for the code in the algorithms, it focuses on how companies and organizations can document and account for how they meet criteria such as anti-discrimination, as well as showing what has been manipulated, for whom, and how (e.g., content in political campaigns) without violating intellectual property rights. Accountability also illustrates how players such as firms and organizations comply with the general regulations and values of a specific market or culture.
Transparency has been a popular topic in recent data, machine learning, and IoT policy studies. Transparency is frequently invoked as a way to comply with privacy requirements (for example, the General Data Protection Regulation in the EU) and to provide insight into where data is being used, for what purposes, by whom, and for how long. This is often regarded as a component of privacy-by-design compliance, but the transparency paradox may prevent transparency from serving as a solution for such complex network accounts: users will be unable to understand the complexity, will not read the disclosures because of it, and will continue to use the service without further reflection (Nissenbaum 2011).
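The disclosure dimensions named above (what data, for what purpose, by whom, for how long) can be sketched as a small machine-readable data-use record, one possible building block of a privacy-by-design approach. The class and field names (`DataUseRecord`, `retention_days`) and the vendor name are hypothetical, not taken from any regulation or standard.

```python
from dataclasses import dataclass, asdict

@dataclass
class DataUseRecord:
    """One declared use of personal data, covering the four questions
    transparency obligations typically raise."""
    data_category: str   # what data is collected
    purpose: str         # for what purpose it is processed
    controller: str      # by whom it is processed
    retention_days: int  # for how long it is kept

# Hypothetical IoT example: a smart thermostat vendor declaring a use.
record = DataUseRecord(
    data_category="room temperature readings",
    purpose="heating optimization",
    controller="ExampleVendor",
    retention_days=90,
)
print(asdict(record))
```

Keeping such records structured, rather than burying them in prose policies, is one route around the transparency paradox: the same declarations can feed both a human-readable summary and automated compliance checks.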

These difficulties are not diminishing with IoT but increasing, as interfaces become more physical and interwoven.


Some researchers believe that transparency in algorithms and code helps ensure that corporations meet general human rights rules as well as specialized regulations within distinct marketplaces (Sandvig et al. 2014). Other scholars, however, have argued that transparency of the algorithm will not solve the problem of governance, because algorithms are intertwined and often work autonomously, implying that the outcome on values can only be measured when the algorithm meets the data (Bechmann and Bowker 2017; Kroll et al. 2016; Ananny and Crawford 2016).
Transparency is one of two fundamental concepts in ethics discussed here, the other being privacy. To some extent, the two may be mutually exclusive and do not necessarily work together. This tension is exemplified by the propagation and combating of disinformation: to counter disinformation, data scientists and regulators must understand its spread at scale, yet by granting access to data on information exposure, authorities would violate the privacy of the networks that shared that information (EU Commission 2018). In view of future data demands, privacy has nonetheless been elevated to a high priority in EU governance with the implementation of the General Data Protection Regulation. Privacy, however, is more than a way to comply with regulations; it is also a way to communicate corporate social responsibility to the user, with the potential to strengthen the trust relationship between company and user on which IoT increasingly relies to attract and retain users despite the creep factor and fear scenarios exemplified in the sections above.
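One common way to soften the tension just described is to release exposure statistics only in aggregate, suppressing any count small enough to identify individuals. The sketch below illustrates this k-anonymity-style threshold; the function name `releasable_exposure_counts` and the choice of k are assumptions for illustration, not a prescribed regulatory mechanism.

```python
from collections import Counter

K = 5  # minimum group size before a count may be released (assumed threshold)

def releasable_exposure_counts(shares, k=K):
    """Aggregate per-item share counts from (user, item) pairs and
    suppress any item shared by fewer than k accounts, so researchers
    can study spread at scale without exposing small, identifiable
    groups of sharers."""
    counts = Counter(item for _, item in shares)
    return {item: n for item, n in counts.items() if n >= k}

# Hypothetical sharing log: seven accounts share story-A, one shares story-B.
shares = [("user%d" % i, "story-A") for i in range(7)] + [("user0", "story-B")]
print(releasable_exposure_counts(shares))  # {'story-A': 7}
```

The widely shared item is visible to auditors and regulators, while the item shared by a single account is withheld, giving partial transparency about spread without disclosing who shared what.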

The Currency of Trust: Challenges and Opportunities for IoT


Trust is a notion that describes an unequal power dynamic between two or more actors, in this case the corporation and the user(s), where one actor has more knowledge than the other (Giddens 2013). Trust thus becomes the essential currency in IoT, and it must be built into future IoT solutions to ensure a long-term and trustworthy relationship with consumers. How will businesses prevent attacks on trust, as we have seen with terror scenarios and meticulously planned disinformation campaigns? More research is needed to address this critical subject in depth and at various levels, including interface design, infrastructure, and protocols.

This chapter has provided an overview of three interconnected disciplines important to the development of IoT. It is evident that we are still in the early stages of providing tools and understanding in terms of technology, business, and ethics for developing IoT applications and services that will drive creative business models. Technology might be viewed as driving innovation, while the business and ethics discussions lag behind. The technical problem centers on how to engage with business and ethical challenges to develop solutions that are interoperable, safe, and efficient while also providing the appropriate level of privacy mechanisms for ethical IoT use. The business problem involves modeling a company's activities in a digital environment with tools that remain agnostic to novel technologies and thus fail to fully realize the potential of IoT; a mismatch between understanding and drivers is emerging, and we are seeing attempts to bridge the gap, as demonstrated by Vermesan et al. (2016) and Mansour et al. (2018). The third part of the triangle, IoT and ethics, provides insight into exploitations of the technology that reveal its immaturity.
