
          Better manage risks inherent in Big Data

          By Ernest Davis | China Daily | Updated: 2017-02-13 08:00


          A man tries out a VR (virtual reality) device during the ongoing Big Data Expo 2016 in Guiyang, capital of Southwest China's Guizhou province, May 25, 2016. [Photo/Xinhua]

In the last 15 years, we have witnessed an explosion in the amount of digital data available (from the Internet, social media, scientific equipment, smartphones, surveillance cameras, and many other sources) and in the computer technologies used to process it. "Big Data", as it is known, will undoubtedly deliver important scientific, technological, and medical advances. But Big Data also poses serious risks if it is misused or abused.

For a start, having more data is no substitute for having high-quality data. For example, a recent article in Nature reports that election pollsters in the United States are struggling to obtain representative samples of the population, because they are legally permitted to call only landline telephones, whereas Americans increasingly rely on cellphones. And while one can find countless political opinions on social media, these aren't reliably representative of voters, either. In fact, a substantial share of tweets and Facebook posts about politics are computer-generated.

Flawed data can also entrench social bias. Image searches for "unprofessional hairstyles", for example, have been shown to return mostly pictures of black women. A Big Data program that used such search results to evaluate hiring and promotion decisions might penalize black candidates who resembled the pictures in those results, thereby perpetuating traditional social biases. And this isn't just a hypothetical possibility. Last year, a ProPublica investigation of "recidivism risk models" demonstrated that a widely used methodology for determining sentences for convicted criminals systematically overestimates the likelihood that black defendants will commit crimes in the future, and underestimates the risk that white defendants will do so.
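The kind of disparity ProPublica reported can be checked with a simple group-by-group comparison of error rates. The following Python lines are a minimal, hypothetical sketch (not ProPublica's own analysis or data): they assume a list of past cases, each recording a group label, whether the model flagged the person as high risk, and whether he or she actually re-offended, and then compare the false positive rate for each group.

from collections import defaultdict

cases = [
    # (group, flagged_high_risk, reoffended) - hypothetical records
    ("A", True,  False),
    ("A", True,  False),
    ("A", True,  True),
    ("B", False, False),
    ("B", True,  True),
    ("B", False, True),
]

def false_positive_rate(rows):
    # Share of people who did NOT re-offend but were still flagged high risk.
    non_reoffenders = [r for r in rows if not r[2]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(1 for r in non_reoffenders if r[1])
    return flagged / len(non_reoffenders)

by_group = defaultdict(list)
for row in cases:
    by_group[row[0]].append(row)

for group, rows in sorted(by_group.items()):
    print(f"group {group}: false positive rate = {false_positive_rate(rows):.2f}")

If one group's false positive rate is consistently higher than another's, non-reoffending members of that group are being labeled "high risk" more often, which is the sort of systematic imbalance the investigation described.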

          Another hazard of Big Data is that it can be gamed. When people know that a data set is being used to make important decisions that will affect them, they have an incentive to tip the scales in their favor. For example, teachers who are judged according to their students' test scores may be more likely to "teach to the test," or even to cheat.

Similarly, college administrators who want to move their institutions up in the US News & World Report rankings have made unwise decisions, such as investing in extravagant gyms at the expense of academics. Worse, they have made grotesquely unethical decisions, such as the effort by Mount Saint Mary's University to boost its "retention rate" by identifying and expelling weaker students in the first few weeks of school.

A third hazard is privacy violations, because so much of the data now available contains personal information. In recent years, enormous collections of confidential data have been stolen from commercial and government sites; and researchers have shown how people's political opinions or even sexual preferences can be accurately gleaned from seemingly innocuous online postings, such as movie reviews, even when they are published pseudonymously.

          Finally, Big Data poses a challenge for accountability. Someone who feels that he or she has been treated unfairly by an algorithm's decision often has no way to appeal it, either because specific results cannot be interpreted, or because the people who have written the algorithm refuse to provide details about how it works. And while governments or corporations might intimidate anyone who objects by describing their algorithms as "mathematical" or "scientific," they, too, are often awed by their creations' behavior. The European Union recently adopted a measure guaranteeing people affected by algorithms a "right to an explanation"; but only time will tell how this will work in practice.

          When people who are harmed by Big Data have no avenues for recourse, the results can be toxic and far-reaching, as data scientist Cathy O'Neil demonstrates in her recent book Weapons of Math Destruction.

          The good news is that the hazards of Big Data can be largely avoided. But they won't be unless we zealously protect people's privacy, detect and correct unfairness, use algorithmic recommendations prudently, and maintain a rigorous understanding of algorithms' inner workings and the data that informs their decisions.

          The author is a professor of computer science at the Courant Institute of Mathematical Sciences, New York University.

          Project Syndicate
