Handling external file workflows in D — best approach for consistency across runs?

Alisha Winson alishawinson2812 at gmail.com
Sun Apr 26 17:55:56 UTC 2026


Hi all,

I’ve been experimenting with D for a small tool that processes 
and organizes media files (mostly short video clips and related 
metadata). The goal is to build a simple pipeline that scans 
directories, groups assets, and prepares them for downstream use. 
The basic flow is:

- Read files from a structured directory (clips, images, and
  metadata exported from a free video editor,
  https://cepcutapk.com/)
- Perform some lightweight processing (naming, grouping,
  timestamp alignment)
- Output a cleaned/organized set of assets for later use
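
For context, my scan step currently looks roughly like this
("clips" is a stand-in for the real input root):

import std.algorithm : filter, map;
import std.array : array;
import std.file : dirEntries, SpanMode;
import std.stdio : writeln;

// Collect every regular file under the input root.
// Note: dirEntries documents no particular iteration order,
// which I suspect is one source of the run-to-run differences.
string[] scanAssets(string root)
{
    return dirEntries(root, SpanMode.breadth)
        .filter!(e => e.isFile)
        .map!(e => e.name)
        .array;
}

void main()
{
    foreach (path; scanAssets("clips"))
        writeln(path);
}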

The issue I’m running into is a lack of consistency across
runs. Even when the input set doesn’t change, I occasionally
see:

- Slight differences in file ordering
- Inconsistent timestamp handling depending on the source
- Edge cases where files are skipped or processed in a
  different sequence
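
Explicitly sorting the collected paths seems to stabilize the
ordering on my machine, but I’m not sure a plain lexicographic
sort of raw paths is the right answer across platforms. What
I’ve bolted on so far, reusing scanAssets from the sketch
above:

import std.algorithm : sort;

// Pin down a stable processing order. This is a byte-wise
// lexicographic sort, so case differences on case-insensitive
// filesystems could still reorder the "same" tree elsewhere.
string[] stableOrder(string[] paths)
{
    sort(paths);
    return paths;
}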

This becomes noticeable when those assets are later used in tools 
like CapCut, where ordering and timing really matter for 
maintaining a predictable timeline. What I’m trying to figure out:

- What’s the most reliable way in D to enforce deterministic
  file ordering (especially across platforms)?
- Are there recommended patterns for handling timestamps
  consistently (filesystem vs. embedded metadata)?
- Any best practices for building repeatable file-processing
  pipelines in D?
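
On the timestamp question, what I’m experimenting with is
normalizing the filesystem mtime to UTC before it goes into
the output, roughly like this (I’m not sure whether the
filesystem time is even the right source versus the clip’s
embedded metadata):

import std.datetime.systime : SysTime;
import std.file : timeLastModified;

// Read the filesystem mtime and normalize it to UTC so the
// machine's local timezone doesn't leak into the output.
string normalizedMtime(string path)
{
    SysTime t = timeLastModified(path).toUTC;
    return t.toISOExtString(); // e.g. 2026-04-26T17:55:56Z
}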

I’m currently using standard library modules (std.file, std.path, 
etc.), but I’m open to restructuring things if there’s a more 
idiomatic or robust approach.

Would appreciate any insights or pointers.

Thanks!

