Independent Research Project

How Information Is Manipulated in the Digital Age

Documenting media influence, algorithmic shaping, narrative control, and psychological leverage — with primary sources.

Focus: Evidence-Based Documentation
Posture: Observatory, Not Activist
Standard: Assertions Traceable to Sources

What We Document

Scope of Research

This project examines how information, media, and perception are shaped, distorted, and exploited in the modern digital environment. Our coverage is precise, measured, and non-partisan.

01

Media Framing

How narrative structures, language choices, and editorial decisions shape public understanding of events and issues.

02

Algorithmic Amplification

How platform systems determine visibility, reach, and the structural advantages certain content types receive.

03

Platform Incentives

Economic and behavioral structures that influence what gets created, shared, and consumed at scale.

04

Information Disorder

Distinguishing between misinformation, disinformation, and malinformation — and why these distinctions matter.

05

Behavioral Influence

Psychological techniques deployed at scale to shape perception, emotion, and decision-making.

06

Institutional Messaging

How governments, corporations, and organizations coordinate and deploy strategic communications.

Core Domains

Pillars of Analysis

Five interconnected areas form the foundation of our research. Each represents a distinct mechanism through which information is shaped and perception is influenced.

Media & Framing

Examining how editorial choices, story selection, and narrative construction shape understanding.

Platforms & Algorithms

How systems determine what billions of people see, and the structural biases embedded within them.

Social Amplification

Network effects, viral dynamics, and how information spreads through human and automated networks.

Institutional Messaging

Coordinated communications from governments, corporations, and organizations.

Psychological Techniques

Cognitive biases, emotional triggers, and persuasion mechanisms leveraged at scale.

Evidence First

Case Studies

Documented examples of information manipulation with primary sources. Each case study follows a rigorous structure: what was claimed, what was amplified, what the evidence shows.

Algorithmic Influence · Research Archive

Platform Recommendation Systems and Radicalization Pathways

An examination of how recommendation algorithms create content pathways that progressively expose users to more extreme material, with analysis of platforms' internal research.

Internal Documents · Peer-Reviewed Research · Platform Data
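
To make the mechanism concrete, the toy model below simulates a recommender that greedily maximizes predicted engagement over a catalog of items tagged with an assumed "intensity" score. Every name and number in it (the catalog, the engagement model, the step size) is an illustrative assumption, not platform code; the point is only that when predicted engagement peaks at a small step up in intensity, repeated recommendation drifts a session toward more extreme material.

```python
# Toy model: greedy engagement-maximizing recommendation and intensity drift.
# All names (intensity, predicted_engagement, weights) are illustrative
# assumptions, not references to any real platform's internals.
import random

random.seed(42)

# A pool of items, each tagged with an "intensity" score in [0, 1].
CATALOG = [{"id": i, "intensity": random.random()} for i in range(500)]


def predicted_engagement(item, last_intensity):
    """Assumed engagement model: items slightly more intense than what the
    user just consumed score highest; big jumps and steps down score lower."""
    delta = item["intensity"] - last_intensity
    return 1.0 - abs(delta - 0.05)  # peak at a small upward step


def recommend(last_intensity, k=20):
    """Greedy ranker: sample candidates, return the top-scoring one."""
    candidates = random.sample(CATALOG, k)
    return max(candidates, key=lambda it: predicted_engagement(it, last_intensity))


# Simulate a session starting from mild content and log the drift upward.
current = 0.10
trajectory = [current]
for _ in range(15):
    current = recommend(current)["intensity"]
    trajectory.append(current)

print(" -> ".join(f"{x:.2f}" for x in trajectory))
```

The drift comes entirely from the assumed correlation between engagement and a small upward step in intensity; the sketch illustrates the feedback structure, not any specific platform's behavior.
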
Media Framing · Research Archive

Headline vs. Body: Divergence Patterns in News Coverage

Systematic analysis of cases where an article's headline conveyed significantly different implications from its body text, and the engagement patterns such divergence creates.

Content Analysis · Engagement Metrics · Primary Sources
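
One crude way to operationalize headline/body divergence, offered only as a sketch and not as this project's methodology, is to measure how much of a headline's substantive vocabulary never appears in the body text. The stopword list, threshold-free scoring, and example below are arbitrary assumptions.

```python
# Minimal headline/body divergence check: what share of substantive headline
# terms is absent from the body text? Purely illustrative; real content
# analysis would use far richer linguistic and semantic features.
import re

STOPWORDS = {"a", "an", "the", "of", "to", "in", "on", "for", "and", "or",
             "is", "are", "was", "were", "with", "by", "at", "as", "it"}


def tokens(text):
    return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]


def headline_divergence(headline, body):
    """Fraction of substantive headline tokens that never occur in the body."""
    head = set(tokens(headline))
    if not head:
        return 0.0
    body_vocab = set(tokens(body))
    return len(head - body_vocab) / len(head)


if __name__ == "__main__":
    headline = "Study proves coffee cures anxiety"
    body = ("A small observational study found a weak association between "
            "moderate coffee consumption and self-reported anxiety levels.")
    print(f"divergence = {headline_divergence(headline, body):.2f}")
    # 0.40 here: 'proves' and 'cures' have no support in the body text
```
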
Behavioral Influence · Research Archive

Emotional Contagion Experiments and Platform Awareness

Documentation of large-scale psychological experiments conducted on platform users without informed consent, and subsequent policy responses.

Published Studies · Regulatory Filings · Expert Analysis

Information Disorder · Research Archive

Coordinated Inauthentic Behavior: Patterns and Detection

Analysis of identified influence operations, their characteristics, reach, and the methodologies used to detect and attribute coordinated campaigns.

Platform Reports · OSINT Analysis · Government Disclosures
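
A frequently cited detection signal in public OSINT work is temporal co-posting: many distinct accounts publishing the same text within a narrow window. The sketch below implements only that single signal; the field names, window size, and thresholds are illustrative assumptions, not a production detector.

```python
# Flag potential coordination: identical (normalized) text posted by many
# distinct accounts within a short time window. Thresholds are illustrative.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)   # assumed coordination window
MIN_ACCOUNTS = 5                 # assumed minimum cluster size


def normalize(text):
    return " ".join(text.lower().split())


def coordinated_clusters(posts):
    """posts: iterable of dicts with 'account', 'text', 'timestamp' (datetime).
    Returns clusters where >= MIN_ACCOUNTS distinct accounts posted the same
    normalized text within WINDOW of each other."""
    by_text = defaultdict(list)
    for p in posts:
        by_text[normalize(p["text"])].append(p)

    clusters = []
    for text, group in by_text.items():
        group.sort(key=lambda p: p["timestamp"])
        start = 0
        for end in range(len(group)):
            # shrink the window from the left until it spans <= WINDOW
            while group[end]["timestamp"] - group[start]["timestamp"] > WINDOW:
                start += 1
            accounts = {p["account"] for p in group[start:end + 1]}
            if len(accounts) >= MIN_ACCOUNTS:
                clusters.append({"text": text, "accounts": sorted(accounts)})
                break  # one flag per text is enough for this sketch
    return clusters


if __name__ == "__main__":
    t0 = datetime(2024, 1, 1, 12, 0)
    demo = [{"account": f"acct_{i}", "text": "Share this NOW!!",
             "timestamp": t0 + timedelta(minutes=i)} for i in range(6)]
    print(coordinated_clusters(demo))
```

Real attribution work layers many more signals (account creation patterns, network structure, infrastructure reuse); this captures only the simplest temporal fingerprint.
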
Systems, Not Villains

Mechanisms of Influence

Understanding how structural incentives and systemic design create conditions for manipulation — independent of individual intent.

01

Incentive Structures

How economic models reward engagement over accuracy, creating systemic pressure toward sensationalism.

02

Engagement Optimization

Algorithmic systems designed to maximize time-on-platform through emotional activation (a simplified ranking sketch follows this list).

03

Sensationalism Loops

Feedback cycles where extreme content outperforms moderate content, shifting baseline expectations.

04

Attention Economics

The commodification of human attention and its implications for information quality.

05

Algorithmic Feedback

How personalization creates information environments that reinforce existing beliefs.
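
As a concrete illustration of the engagement-optimization mechanism (item 02 above): if a feed ranker scores posts by predicted interactions, and emotionally activating content reliably attracts more comments and shares, that content rises to the top without anyone deciding to promote it. The weights and feature names below are assumptions made for the sketch, not any platform's actual scoring model.

```python
# Simplified feed ranking: score = weighted sum of predicted interactions.
# If outrage-adjacent content earns more predicted comments and shares, it
# rises to the top purely as a by-product of optimizing for engagement.
WEIGHTS = {"p_click": 1.0, "p_comment": 4.0, "p_share": 6.0}  # assumed weights


def score(item):
    return sum(WEIGHTS[k] * item[k] for k in WEIGHTS)


def rank_feed(items):
    return sorted(items, key=score, reverse=True)


if __name__ == "__main__":
    feed = [
        {"title": "Measured policy explainer", "p_click": 0.20, "p_comment": 0.02, "p_share": 0.01},
        {"title": "Outrage-bait hot take", "p_click": 0.25, "p_comment": 0.12, "p_share": 0.08},
    ]
    for item in rank_feed(feed):
        print(f"{score(item):.2f}  {item['title']}")
```

No editorial villain is required: the ranking preference falls out of the objective function, which is the sense in which these are systems, not villains.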

"

The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society.

Edward Bernays — Propaganda, 1928

Critical Distinctions

Information Disorder

Understanding the taxonomy of false and misleading information is essential for accurate analysis. Intent matters. Outcomes matter. These distinctions enable precision.

Misinformation

/ˌmɪsɪnfəˈmeɪʃ(ə)n/

False information shared without intent to deceive. The person spreading it believes it to be true. Harm occurs through ignorance, not malice.

Disinformation

/dɪsˌɪnfəˈmeɪʃ(ə)n/

Deliberately false information spread with intent to deceive. Created or disseminated with knowledge of its falsity. Strategic and purposeful.

Malinformation

/ˌmalɪnfəˈmeɪʃ(ə)n/

Genuine information shared with intent to cause harm. True facts weaponized through selective disclosure, decontextualization, or timing.
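
The three categories reduce, roughly, to two questions: is the content false, and is it shared with intent to deceive or cause harm? A minimal encoding of that two-axis view (the function name and labels are purely illustrative):

```python
# Two-axis taxonomy of information disorder: falsity x deceptive/harmful intent.
# Labels follow the misinformation / disinformation / malinformation
# distinctions described above; the function is only an illustrative encoding.
def classify(is_false: bool, harmful_intent: bool) -> str:
    if is_false and harmful_intent:
        return "disinformation"   # deliberately false, spread to deceive
    if is_false:
        return "misinformation"   # false, but shared in good faith
    if harmful_intent:
        return "malinformation"   # true, but weaponized to cause harm
    return "information"          # true and shared in good faith


assert classify(True, True) == "disinformation"
assert classify(True, False) == "misinformation"
assert classify(False, True) == "malinformation"
```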

Standards

Research Methodology

Every assertion on this site is traceable to sources. We distinguish clearly between documented fact, informed analysis, and explicitly labeled opinion.

Our methodology prioritizes primary sources, peer-reviewed research, and verifiable documentation over secondary reporting or speculation.

Primary sources prioritized over secondary reporting

Clear distinction between evidence and analysis

Opinion explicitly labeled when present

Corrections published transparently

Source quality documented and assessed

No allegiance to ideology or institution

Living Documentation

Recent Research

This is active documentation, not static opinion. New research, platform policy changes, and notable information events are added continuously.