From Containers to Chaos: Why Freight Data Never Got Standardised

Modern freight began with a simple idea. Standardise the box.



On 26 April 1956, the SS Ideal X sailed from Newark to Houston carrying 58 containers. Transport Geography, a long-standing academic resource on global transport systems, identifies this voyage as the starting point of modern containerised shipping.


It marked a decisive shift away from break-bulk cargo toward repeatable, modular units that could move efficiently through global logistics networks.


Just over a decade later, that idea was formalised. In 1968, the International Organization for Standardization introduced standards for Series 1 freight containers.


ISO documentation, supported by publications from the American National Standards Institute, shows how shared dimensions and handling requirements allowed containers to move seamlessly between ships, trucks, and rail across international supply chains.


The physical supply chain scaled because the industry agreed on what a container was.


Freight data never followed the same path.


Freight data standards exist, but there is no single foundation


It would be incorrect to say freight data has no standards. The reality is more complex. Logistics data standards emerged gradually, across different transport modes, and under varying commercial, technical, and regulatory pressures.


One of the earliest and most influential developments in freight data exchange was UN EDIFACT. Records from the United Nations Economic Commission for Europe show that EDIFACT was approved as an international standard in 1987 under ISO 9735 to support structured electronic data interchange between trading partners.


This enabled digital communication at scale. However, EDIFACT focused on how messages were structured and transmitted, not on establishing a shared data model or common semantics.


Integrations therefore relied on partner-specific mappings that had to be built, maintained, and updated over time.


The industry standardised how data moved, but not what the data meant.


In practice, this often shows up as the same shipment being re-entered, reconciled, or corrected multiple times as it moves between systems.
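The gap between standardised syntax and unshared semantics can be sketched with a toy example. The segment below is a simplified EDIFACT-style transport line; the raw values, field positions, and partner mapping tables are illustrative assumptions, not drawn from any real implementation.

```python
# Illustrative only: a simplified EDIFACT-style segment and two
# partner-specific interpretations of the same positional fields.

RAW = "TDT+20+4567+1++MAEU:172:20'"  # hypothetical transport details segment

def parse_segment(raw: str) -> list[str]:
    """Split a segment into data elements (simplified: no escape handling)."""
    return raw.rstrip("'").split("+")

# Each partner assigns its own field names to the same positions,
# so every integration needs its own mapping to be built and maintained.
PARTNER_A = {1: "transport_stage", 2: "voyage_number", 5: "carrier_code"}
PARTNER_B = {1: "leg_qualifier", 2: "conveyance_ref", 5: "scac"}

def map_elements(elements: list[str], mapping: dict[int, str]) -> dict[str, str]:
    return {name: elements[i] for i, name in mapping.items()}

elements = parse_segment(RAW)
print(map_elements(elements, PARTNER_A))
print(map_elements(elements, PARTNER_B))
```

The syntax parses identically everywhere; the meaning of each element still lives in bilateral mapping tables, which is exactly the maintenance burden described above.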


Transport modes evolved their data standards independently


Freight forwarding spans multiple modes, each shaped by its own operational realities, governing bodies, and technology cycles.


In air cargo, the International Air Transport Association explains that Cargo-XML was introduced to modernise electronic communication between airlines, freight forwarders, and logistics partners.


IATA also acknowledges that legacy formats remain in active use, resulting in prolonged transition periods. More recently, IATA has outlined its ONE Record initiative, which aims to replace message-based exchange with a shared shipment data model accessed through APIs.


In container shipping, the Digital Container Shipping Association publishes Track and Trace standards that define common data structures and interfaces for shipment visibility. DCSA positions these standards as a way to ensure consistent interpretation of events and milestones across the container shipping ecosystem.
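As a sketch of what a shared event model buys, the snippet below normalises two hypothetical carrier status feeds into one common event shape loosely modelled on DCSA-style equipment events. The feed formats, field names, and codes are assumptions for illustration, not the published DCSA schema.

```python
from datetime import datetime, timezone

# Two hypothetical carrier feeds reporting the same container discharge,
# with different field names and timestamp conventions.
carrier_a = {"status": "DISCHARGED", "container": "MSKU1234567",
             "ts": "2024-05-01T08:30:00Z"}
carrier_b = {"event_code": "UNLOAD", "unit_id": "MSKU1234567",
             "occurred_at": 1714552200}  # unix seconds, UTC

def normalise_a(msg: dict) -> dict:
    return {
        "eventType": "EQUIPMENT",
        "equipmentReference": msg["container"],
        "eventDateTime": msg["ts"],
    }

def normalise_b(msg: dict) -> dict:
    ts = datetime.fromtimestamp(msg["occurred_at"], tz=timezone.utc)
    return {
        "eventType": "EQUIPMENT",
        "equipmentReference": msg["unit_id"],
        "eventDateTime": ts.strftime("%Y-%m-%dT%H:%M:%SZ"),
    }

a, b = normalise_a(carrier_a), normalise_b(carrier_b)
print(a == b)  # once normalised, the two feeds describe one comparable event
```

The point of a shared standard is that this translation layer is written once, against one model, rather than rebuilt for every carrier pair.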


Customs data follows a different path again. The World Customs Organization positions its Data Model as a universal reference framework for cross-border regulatory data exchange. In practice, national customs authorities adapt the model to local requirements, creating variation that freight forwarders must manage on a country-by-country basis.


Together, these parallel evolutions explain why freight data remains fragmented even as physical shipping has become increasingly standardised.


Documents remain a major source of operational friction


Freight operations are still heavily document-driven. Bills of lading, manifests, customs declarations, and commercial invoices remain central to how cargo moves through global trade systems.


The International Federation of Freight Forwarders Associations describes its electronic Bill of Lading initiative as a framework for secure and interoperable digital bills of lading. FIATA also notes that adoption depends on acceptance by carriers, banks, insurers, and regulators across jurisdictions.


While progress is real, document workflows continue to expose gaps between digital ambition and operational reality.


Why containers standardised more easily than freight data


Containers succeeded because they were constrained by physical reality.


ISO standards define strict dimensional and structural requirements that allow containers to be lifted, stacked, and transported safely across ports, vessels, rail, and road. Without alignment, global container interoperability would not have been possible.


Freight data faces no equivalent physical constraint. If systems do not align, shipments can still move through workarounds such as spreadsheets, emails, PDFs, or manual re-entry. Industry commentary from bodies including IATA and UNECE highlights that these workarounds reduce short-term pressure to converge on shared data models, even though they introduce long-term operational cost and risk.


The freight data ecosystem is also broader. A container primarily requires alignment across ports, carriers, and equipment providers. Freight data must align across forwarders, shippers, carriers, customs authorities, regulators, financial institutions, and software platforms, each with different incentives and timelines.


Why logistics standards alone are not enough


UNECE materials note that EDIFACT remains foundational but relies on bilateral implementation and ongoing maintenance.


IATA publications emphasise that initiatives such as Cargo-XML and ONE Record only deliver full value when adoption is coordinated across the logistics ecosystem. DCSA similarly notes that carrier-level standardisation does not automatically guarantee interoperability downstream.


What freight operators actually need


For freight forwarders and logistics operators, the challenge is not a lack of standards. It is the operational burden created by their fragmentation.


What is needed is operational clarity. A single view across transport modes. Structured data that flows into customs processes without repeated translation. Workflows that remove duplication. Document handling that reduces manual intervention. Financial visibility tied directly to shipments rather than spread across disconnected systems.


These are operational needs, not theoretical ideals.


The takeaway


The shipping container transformed global trade because the industry aligned around a shared physical interface.


Freight data evolved differently. It accumulated standards, systems, and workarounds without a single unifying foundation. The result is a digital logistics landscape that still relies heavily on translation and manual effort.


If physical shipping were designed this way today, it would never scale.


The next meaningful shift in global freight will not come from digitisation alone. It will come from reducing fragmentation, so that freight data becomes as consistent, portable, and trusted as the container itself.