Editor’s Note: Surveillance in occupied territory now sits at the intersection of cyber operations, humanitarian law, human rights law, and data governance. This analysis, a companion piece to today’s earlier overview of the Cyber Law Toolkit’s September 2025 update, takes a closer look at Scenario 35, “Data collection in occupied territory,” and explains why it matters well beyond the battlefield.

For cybersecurity, data privacy, regulatory compliance, information governance, and eDiscovery professionals, the central lesson is practical: security-based data collection cannot rely on broad necessity claims alone. It must be legally grounded, narrowly tailored, proportionate, and defensible across both international humanitarian law and international human rights law. As organizations manage cross-border data flows, authority requests, sensitive personal information, and preservation duties in contested environments, the scenario offers a timely framework for identifying when surveillance and collection move from security measure to legal risk.


Content Assessment: Data collection in occupied territory: A closer read of Cyber Law Toolkit scenario 35

Information - 94%
Insight - 93%
Relevance - 92%
Objectivity - 93%
Authority - 94%

93%

Excellent

A short percentage-based assessment of the qualitative benefit and positive reception of the recent article from ComplexDiscovery OÜ titled, "Data collection in occupied territory: A closer read of Cyber Law Toolkit scenario 35."


Industry News – Cybersecurity Beat

Data collection in occupied territory: A closer read of Cyber Law Toolkit scenario 35

A companion to today’s overview: what the new scenario says, what it suggests, and where international humanitarian law and international human rights law leave the practitioner

ComplexDiscovery Staff

Today’s overview of the Cyber Law Toolkit’s September 2025 update profiled the project’s newest scenario — formally titled “Data collection in occupied territory” and numbered Scenario 35 — alongside the wider Toolkit’s history, partner consortium, and award recognition. What follows is the deeper read.

The scenario, authored by Tatjana Grote and peer-reviewed by Laurie Blank and Massimo Marelli, runs a single connected fact pattern through both international humanitarian law (IHL) and international human rights law (IHRL) and asks, paragraph by paragraph, whether each component of an Occupying Power’s conduct survives. The analysis is published commentary, not legal authority; what follows summarizes the scenario’s structure and the conclusions its analysis reaches.

The fact pattern

In the hypothetical scenario, an international armed conflict between two neighbors runs for several months and ends with State A holding part of State B’s territory in occupation. From there, State A puts three operations in motion.

First, traffic gets quietly rerouted. Internet activity from inside the occupied territory passes through servers run by a state-owned telecom in State A; once it does, every browsing history, message, and online call gets captured, screened, and warehoused.

Second, the army goes door-to-door. Soldiers patrolling the occupied area carry a phone-based reporting tool that feeds into a central database, with instructions to scoop up identifying information from as many residents as they can. The data feeds an algorithmic security score assigned to every inhabitant.

Third — and on what the analysis calls “an almost daily basis” — comes the field collection. Residents are stopped at checkpoints, their photographs are taken, and they’re asked to declare what they do for work, who their relatives are, what their political and religious commitments are, and what their sexual orientation is. The contact is so frequent and intrusive that residents in the most heavily patrolled neighborhoods cut back on going out and on family visits.

State A’s official line on all three operations is the same: “necessary security measures.” Insurgent activity has not, in fact, materially dropped, and the analysis records dissent from inside State A’s own security establishment about whether the program does any actual security work. A leaked internal note recasts the field collection in different terms entirely, describing it as a way of “making all of them pay the price for any insurgent activity.”

Both States are parties to all the relevant treaties: Hague IV (1907), the four Geneva Conventions (1949), Additional Protocol I (1977), and the International Covenant on Civil and Political Rights (ICCPR) (1966). Both also operate domestic data-protection regimes patterned on the EU’s General Data Protection Regulation, though the analysis flags that those regimes may not reach this conduct given the national-security carve-outs that are standard in such laws. The scenario’s “Examples” section links the hypothetical to several real-world cyber incidents in occupied or contested territory; the published page on cyberlaw.ccdcoe.org carries those direct citations.

What the IHL analysis concludes

The threshold question — whether the territory is occupied — is settled by Article 42 of the Hague Regulations: territory is occupied when actually placed under the authority of the hostile army, and that turns on effective control. The analysis observes that effective control is generally difficult to establish through cyber means alone, because occupation is inherently a territorial notion and the non-physical components of cyberspace cannot be “occupied.” The physical infrastructure on which cyberspace runs, however, can be.

Once the law of occupation applies, the analysis weighs State A’s conduct against three sources of obligation.

Article 43 HR — public order and civil life. The Occupying Power’s first obligation under the law of occupation is to ensure and maintain public order and civil life, and the in-person field collection cuts against that obligation. The analysis runs the tension as a proportionality test, balancing the civilian population’s interests against State A’s security interests. To clear the test, a security measure has to be the least intrusive option that can credibly achieve the legitimate aim. Collecting sensitive personal data with no clear security link — sexual orientation information is the analysis’s chosen example — fails on that ground. A blanket invocation of security authority does not carry the day. On Article 43 HR, the analysis lands on partial incompatibility.

Article 27 GC IV — respect for the person. Article 27 of Geneva Convention IV is the humane treatment article. It obliges parties to a conflict to treat protected persons humanely and to respect a defined set of personal interests — the person, honor, family rights, religious practices, manners, and customs. Two readings of “respect for the person” compete in the analysis. Read broadly, the term picks up the right to private life. Read narrowly, it still protects moral integrity and the ability to live an ordinary social and family life. What looks like routine information-gathering — daily stops, photographs, questions about employment, family, political activity, sexual orientation, religion — has measurable second-order effects, what the analysis calls “chilling effects.” In the modeled facts, those effects materialized: residents pulled back from public social activity and from visits to family.

There is a floor. Some Article 27 protections cannot be limited at all; the rest can be limited but not extinguished. The analysis treats that as a hard outer bound: an interference severe enough to gut the core of any Article 27 right is unlawful. Apply that to bulk access to private electronic communications under the broader “person” reading, and the conclusion is that State A’s online surveillance breaches Article 27 GC IV. Even the narrower reading still demands necessity and non-arbitrariness, and a mass indiscriminate program clears neither bar.

Article 33 GC IV — collective punishment. The collective-punishment prohibition catches a wider range of conduct than the word might suggest. “Punishment” here covers harassment and sanctions of any sort that strike a population rather than an individual offender. The in-person sweep, with its restrictions on movement and its routine intrusion into daily life, plausibly fits. The dispositive question is intent. The analysis offers three intent signals — whether the targets are picked without individualized suspicion, whether the measure’s duration and severity match a punitive purpose, and whether the conduct follows specific civilian acts in a way that looks retaliatory. State A’s program is wide-cast, open-ended, and tied directly to insurgent activity by the leaked “pay the price” language. On those signals, the analysis concludes that the in-person collection most likely falls under Article 33 of GC IV. It also notes the converse: a tightly bound, transparent collection regime would make the same intent finding hard to sustain.

The analysis closes the IHL section with a property note: collected data probably does not qualify as private property, intellectual property, or any sort of digital asset entitled to property-based protection from confiscation under the Hague Regulations.

The IHL conclusion: State A is most likely in violation of IHL. Data collection in occupied territory can, in principle, be a lawful security measure, but State A’s conduct goes beyond what is essential — both in who is targeted and in what is collected — and the indiscriminate plus punitive character makes the in-person collection a collective punishment in violation of Article 33 GC IV.

What the IHRL analysis concludes

The IHRL analysis hangs Article 17 of the ICCPR — and the parallel provisions in the Universal Declaration, the European Convention on Human Rights, and the American Convention on Human Rights — over the same fact pattern. Informational privacy is the core of the right at issue. Even straightforward data collection, the analysis points out, interferes with the right; neither secrecy nor downstream processing is required to make the collection a privacy interference.

The analysis confirms that the ICCPR applies extraterritorially in occupied territory, citing the UN Human Rights Committee’s General Comment No. 31 and the International Court of Justice’s positions in the Wall advisory opinion and DRC v Uganda. State A is therefore bound by its ICCPR obligations when acting in the occupied areas of State B.

The compatibility test has four parts. The interference must rest on a clear, accessible, and precise legal basis specifying when and how the interference may occur. It must pursue a legitimate aim recognized in the relevant treaty. It must be necessary and proportionate, with the least intrusive available option as its floor. And the untouchable essence of the right must not be rendered meaningless.
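For teams that encode review gates in tooling, the four prongs can be sketched as a simple screening routine. The sketch below is illustrative only; the record fields and failure labels are assumptions chosen for demonstration, not terms drawn from the Toolkit.

```python
from dataclasses import dataclass

@dataclass
class Interference:
    """Illustrative record of a proposed data-collection measure (hypothetical fields)."""
    has_precise_legal_basis: bool     # clear, accessible, precise rules on when/how
    pursues_legitimate_aim: bool      # aim recognized in the relevant treaty
    is_least_intrusive_means: bool    # necessity and proportionality floor
    preserves_essence_of_right: bool  # the right's core is not rendered meaningless

def compatibility_failures(m: Interference) -> list[str]:
    """Return the prongs of the four-part test that the measure fails.

    Any entry means the interference is incompatible on that ground.
    """
    checks = [
        (m.has_precise_legal_basis, "no clear, accessible, precise legal basis"),
        (m.pursues_legitimate_aim, "no legitimate treaty-recognized aim"),
        (m.is_least_intrusive_means, "not necessary/proportionate (least intrusive means)"),
        (m.preserves_essence_of_right, "essence of the right rendered meaningless"),
    ]
    return [reason for passed, reason in checks if not passed]

# State A's program as the analysis characterizes it: the security aim is
# conceded, but the claimed legal basis lacks precision and the collection
# is not the least intrusive option. Whether the essence of the right is
# breached is contested in the analysis, so it is left unflagged here.
state_a = Interference(
    has_precise_legal_basis=False,
    pursues_legitimate_aim=True,
    is_least_intrusive_means=False,
    preserves_essence_of_right=True,
)
print(compatibility_failures(state_a))  # flags the legal-basis and least-intrusive prongs
```

An empty return value does not establish lawfulness; it only means none of the four prongs flagged on the stated inputs, which remain a question for counsel.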

State A’s program fails on the first prong. The legal-basis claim runs through Article 43 HR and Article 64 GC IV — but those provisions don’t actually spell out the conditions or procedures under which personal data can be collected, screened, or stored. The HRC’s General Comment No. 16 reads the legal-basis requirement to demand a case-by-case privacy interference test, which is structurally at odds with bulk programs. And on the necessity side, sexual orientation collection has no defensible link to the security purpose. The combination — no precise legal basis, no targeted authorization, no necessity link — pushes the interference into arbitrariness territory.

On the IHRL side, the conclusion lands in the same place. State A’s program isn’t compatible with the right to private life as the ICCPR frames it. Compliance would mean three changes in one: a precise, accessible legal foundation that says when data can be collected and how; a collection scope cut back to what is actually necessary for security; and a switch to the least intrusive method that can do the job. In practice, that’s the end of indiscriminate surveillance, and it’s the end of collecting personal data that bears no direct relationship to safety risks.

How IHL and IHRL interact

The most consequential analytical move sits at the back end of the analysis, where the scenario asks how IHL and IHRL slot together when both apply.

Both regimes are operative during armed conflict. Recent ICJ and ECtHR jurisprudence runs against the idea that IHL fully displaces IHRL — advisory practice from 2024 is the latest data point. The harder question is what happens when their obligations look like they conflict.

Two interpretive moves are on offer. Under the harmonious-reading move (the analysis labels it the complementary approach), the two regimes are read together rather than against each other; State A would carry both the IHL duties and the IHRL legal-basis duty, with proportionality cutting across the combined frame.

Under the alternative — lex specialis — the more focused rule edges out the more general one. The catch, the analysis flags, is that “more focused” isn’t self-defining. IHL has more to say about what an Occupying Power can do; IHRL has more to say about how surveillance and the processing of personal data should be handled. If you treat IHL as the lex specialis for occupation, IHRL’s legal-basis requirement could fall away — a conclusion that runs against the grain of recent rulings, including the European Court of Human Rights’ July 2025 judgment in Ukraine and the Netherlands v Russia.

On the harmonious reading, the analysis closes the loop. State A still has to set out a clear, transparent legal basis for any data processing. Proportionality runs across both regimes, pushing the indiscriminate collection out of bounds. Mass online interception could even reach the protected core of the right to private life — and at that point, the question loops back to Article 27 GC IV’s protection of the person.

The practitioner checklist

The Toolkit’s reference checklist is the practitioner takeaway. It walks legal advisers through both regimes.

Under IHL, advisers are asked to test: whether the territory is occupied; whether the data collection measures are the least injurious means available to ensure public order, civil life or the security of the Occupying Power; whether the measures interfere with protected persons’ person, manners or family ties in violation of Article 27 GC IV, including whether they entirely suspend any of those rights; whether the measures constitute inhumane treatment or expose protected persons to public curiosity; whether they are aimed at intimidating protected persons; whether they target entire groups of protected persons regardless of individual responsibility and are adopted with punitive intent; and whether the data in question constitutes private property.

Under IHRL, advisers are asked to test: whether the pertinent treaties are applicable; whether the data processing interferes with the right to private life; whether there is a clear, accessible and transparent legal basis specifying the circumstances and procedures for processing personal data; whether the purpose of processing is compatible with the pertinent treaties; whether the processing is the least intrusive means to achieve the purpose, both as to amount and type of data collected; whether there is a reasonable balance between the interference and the benefits of the legitimate purpose; and whether the measures impinge on the essence of the right to private life or any other human right.
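For organizations that track such questions in compliance workflows, the two checklists can be encoded as a reviewable data structure. The sketch below is a hypothetical encoding, assuming Python tooling; the question wording paraphrases the checklist items above.

```python
# Illustrative encoding of the Toolkit's practitioner checklist for workflow
# tooling. The phrasing paraphrases the scenario's checklist; it is not an
# official rendering.
CHECKLIST: dict[str, list[str]] = {
    "IHL": [
        "Is the territory occupied (authority of the hostile army, Art. 42 HR)?",
        "Are the measures the least injurious means to ensure public order, civil life, or security?",
        "Do the measures interfere with protected persons' person, manners, or family ties (Art. 27 GC IV)?",
        "Do they entirely suspend any Art. 27 right?",
        "Do they constitute inhumane treatment or expose protected persons to public curiosity?",
        "Are they aimed at intimidating protected persons?",
        "Do they target entire groups regardless of individual responsibility, with punitive intent (Art. 33 GC IV)?",
        "Does the data constitute private property?",
    ],
    "IHRL": [
        "Are the pertinent treaties applicable, including extraterritorially?",
        "Does the data processing interfere with the right to private life?",
        "Is there a clear, accessible, transparent legal basis specifying circumstances and procedures?",
        "Is the purpose of processing compatible with the pertinent treaties?",
        "Is the processing the least intrusive means, in amount and type of data?",
        "Is there a reasonable balance between the interference and the benefits of the purpose?",
        "Do the measures impinge on the essence of private life or any other right?",
    ],
}

def unresolved(answers: dict[str, bool]) -> list[str]:
    """Return checklist questions not yet answered, across both regimes."""
    return [q for regime in CHECKLIST.values() for q in regime if q not in answers]
```

The point of the structure is review discipline, not adjudication: an adviser records an answer and rationale per question, and anything left in `unresolved` blocks sign-off.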

What this suggests for cybersecurity, IG, and eDiscovery practitioners

The takeaways below are risk-management implications, not legal conclusions; specific decisions belong to qualified counsel applying the analysis to a given organization’s facts.

Three patterns emerge.

The least-injurious-means test runs through both regimes. Any cyber operation justified by security must survive a structured proportionality analysis: legitimate aim, narrow tailoring, and least intrusive available means. For organizations operating telecom or cloud infrastructure across contested geographies, the test informs which authority requests align with established legal grounds and which warrant escalation to outside counsel.

Indiscriminate collection has trouble passing either regime. The HRC’s case-by-case requirement and the IHL proportionality test point in the same direction: bulk collection of personal data without targeted suspicion is disproportionate under both bodies of law. Organizations being asked to provide data feeds to occupying or transitional authorities should treat indiscriminate scope as the strongest legal red flag.

Sensitive categories require their own justification. The scenario singles out sexual orientation collection as having no clear security link. Practitioners can apply the same logic by data category: each type of personal data collected must trace back to the security purpose. Where the connection is absent, the collection fails the necessity test under either regime.
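One way to operationalize that category-by-category tracing is a simple mapping from each collected data category to its documented security justification. The mapping and the sample justifications below are hypothetical illustrations, not Toolkit content.

```python
from typing import Optional

def untraceable_categories(collection_map: dict[str, Optional[str]]) -> list[str]:
    """Categories of personal data with no documented link to the security purpose.

    Any category returned here fails the necessity test under either regime.
    """
    return [cat for cat, justification in collection_map.items() if not justification]

# Hypothetical mapping for the scenario's checkpoint collection; the two
# justifications shown are illustrative assumptions for the sketch.
checkpoint_collection = {
    "photograph": "identity verification at checkpoints",
    "employment": "screening access to restricted sites",
    "family ties": None,
    "political opinions": None,
    "religious beliefs": None,
    "sexual orientation": None,  # the analysis's example of no security link
}
print(untraceable_categories(checkpoint_collection))  # the four unjustified categories
```

In practice, the justification strings would point to the legal instrument or documented security assessment that authorizes each category, so the trace is auditable rather than asserted.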

For eDiscovery counsel preparing litigation holds in cross-border matters, the analysis’s IHL property note is a separate signal: collected personal data probably does not enjoy property-based protection under the Hague Regulations. That is a legal-strategy point for matters where data preservation across occupied territory is contested.

Where the scenario sits in the broader Toolkit

The scenario lists four sister entries that practitioners can pair with this one: cyber espionage against government departments (Scenario 02); sale of surveillance tools in defiance of international sanctions (Scenario 11); internet blockage (Scenario 24); and cyber disruption of humanitarian assistance (Scenario 25). Read as a set, those scenarios trace a continuous line from peacetime cyber espionage through wartime humanitarian protection — and the data-collection-in-occupied-territory scenario sits at the practical junction of surveillance, data collection, and the law of occupation.

The Toolkit’s bibliography for this scenario points to a clear scholarly cluster: Russell Buchan and Asaf Lubin’s edited volume The Rights to Privacy and Data Protection in Times of Armed Conflict (NATO CCDCOE Publications, 2022) is cited multiple times, alongside Eliav Lieblich and Eyal Benvenisti’s Occupation in International Law (Oxford University Press, 2023), Yoram Dinstein’s The International Law of Belligerent Occupation (Cambridge University Press, 2019), and academic literature on automated systems and biometrics in armed-conflict settings. For the practitioner who wants to go deeper, that bibliography is itself a reading list.

Closing observation

The scenario’s analytical move — asking the same fact pattern to survive both IHL and IHRL — is what makes it useful at the desk level. State A could, in principle, satisfy IHL by tightening proportionality, narrowing collection to suspicion-based targets, and removing categories with no security link. It would still have to clear the IHRL legal-basis hurdle, which IHL alone does not impose. Practitioners who treat the two regimes as a single integrated test — rather than as alternatives — track where the analysis ends up. Practitioners who lean only on IHL, or only on data-protection law, miss half the question.

What does your organization’s cross-border data-handling playbook look like when it has to survive both tests at once?

These are risk-management implications for discussion with qualified counsel, not legal advice.

Assisted by GAI and LLM technologies

Source: ComplexDiscovery OÜ

ComplexDiscovery’s mission is to enable clarity for complex decisions by providing independent, data‑driven reporting, research, and commentary that make digital risk, legal technology, and regulatory change more legible for practitioners, policymakers, and business leaders.

 

ComplexDiscovery OÜ is an independent digital publication and research organization based in Tallinn, Estonia. ComplexDiscovery covers cybersecurity, data privacy, regulatory compliance, and eDiscovery, with reporting that connects legal and business technology developments—including high-growth startup trends—to international business, policy, and global security dynamics. Focusing on technology and risk issues shaped by cross-border regulation and geopolitical complexity, ComplexDiscovery delivers editorial coverage, original analysis, and curated briefings for a global audience of legal, compliance, security, and technology professionals. Learn more at ComplexDiscovery.com.

 

Generative Artificial Intelligence and Large Language Model Use

ComplexDiscovery OÜ recognizes the value of GAI and LLM tools in streamlining content creation processes and enhancing the overall quality of its research, writing, and editing efforts. To this end, ComplexDiscovery OÜ regularly employs GAI tools, including ChatGPT, Claude, Gemini, Grammarly, Midjourney, and Perplexity, to assist, augment, and accelerate the development and publication of both new and revised content in posts and pages published since late 2022.

ComplexDiscovery also provides a ChatGPT-powered AI article assistant for its users. This feature leverages LLM capabilities to generate relevant and valuable insights related to specific page and post content published on ComplexDiscovery.com. By offering this AI-driven service, ComplexDiscovery OÜ aims to create a more interactive and engaging experience for its users, while highlighting the importance of responsible and ethical use of GAI and LLM technologies.