Eduardo Suastegui
Story-teller #indieauthor of character-driven, fast-flowing #thrillers that stir the heart.
Eduardo Suastegui's posts

Visit my website and check out the 3 series I've written... a 4th in the works! #amwriting

While there, join my Reader's Club to get #free eBook downloads, and to stay up on what's coming next.

A #wordoftheday to describe most of my lunches. 

What is the future of the Air Force? #Drones, directed-energy weapons and the future of war. @GeorgeWill

How #HealthCare (among other things) became a right. Hint: thank FDR's logic.

Elon Musk Says He Has 'Verbal' OK To Build N.Y.-D.C. #Hyperloop

DOJ takes down Dark Web market AlphaBay, founder dead after arrest #cybercrime

"The Department of Justice and its international partners announced Thursday a takedown of a massive Dark Web marketplace that was allegedly one of the world's biggest sources for the sale of drugs and illicit materials, striking a blow to the cybercriminal underground.

"In shutting down AlphaBay and arresting its alleged founder, Alexandre Cazes, the Justice Department also is going after the assets of Cazes, who is charged with running the sophisticated anonymous market and was worth upwards of $23 million, according to court documents. Cazes apparently took his life on July 12 while in custody in Thailand after authorities arrested him there a week prior, the DOJ said..."

Might it be that human #bias is a byproduct of mathematical averaging based on sampling of human behavior?

On the Occam's razor side: garbage in, garbage out (GIGO). Algorithms simply implement the human thinking (and biases) of their creators.
Technology Is Biased Too. How Do We Fix It? | FiveThirtyEight

'... Rather than relying on human judgment alone, organizations are increasingly asking algorithms to weigh in on questions that have profound social ramifications, like whether to recruit someone for a job, give them a loan, identify them as a suspect in a crime, send them to prison or grant them parole.

But an increasing body of research and criticism suggests that algorithms and artificial intelligence aren’t necessarily a panacea for ending prejudice, and they can have disproportionate impacts on groups that are already socially disadvantaged, particularly people of color. Instead of offering a workaround for human biases, the tools we designed to help us predict the future may be dooming us to repeat the past by replicating and even amplifying societal inequalities that already exist.

These data-fueled predictive technologies aren’t going away anytime soon. So how can we address the potential for discrimination in incredibly complex tools that have already quietly embedded themselves in our lives and in some of the most powerful institutions in the country?...'
