Benchmark Coverage

Industry Taxonomy

We are collecting tasks in 63 professional fields across 13 domains, from engineering and life sciences to healthcare, law, and creative media, each grounded in real workflows with verifiable outcomes.

Classification based on O*NET / SOC 2018 + OECD FORD + NSF / NIH / DARPA frontier fields

Industry Cards

Landscape snapshots and representative workflows

A curated catalog of professional fields, organized by domain group. Each card gives a quick landscape snapshot and three representative workflows that inform benchmark design.

⚙️

Engineering & Architecture

Aerospace Engineering

Aerospace engineering is the discipline of designing, analyzing, integrating, and verifying aircraft, spacecraft, and related flight systems across commercial aviation, defense, spaceflight, and specialist advisory environments. In practice, experts usually self-locate in recognized regions such as aerodynamic design and external-flow analysis, commonly executed in OpenFOAM or SU2; mission analysis, astrodynamics, and navigation, typically anchored in GMAT or Orekit; flight mechanics, handling qualities, and control-law validation, often run in JSBSim or MATLAB/Simulink Aerospace Blockset; structures, loads, and certification-facing verification, which in many programs are managed alongside Nastran-class finite-element environments; and vehicle integration, digital mission engineering, and operational scenario assessment, frequently organized in STK. The field spans both operator-side engineering inside OEMs, primes, airlines, and space operators, and advisory or test-facing work performed by research labs, regulators, and specialist consultancies. Across all these regions, the work is difficult for the same reason: geometry, physics models, reference frames, external data, controls, and regulatory evidence are tightly coupled, so small inconsistencies in mesh strategy, force models, timing standards, or requirement baselines propagate directly into downstream performance claims, safety margins, and certification or mission decisions.

Workflow 1: OpenVSP + Gmsh + OpenFOAM
ONERA M6 Transonic RANS Validation
Workflow 2: Gmsh + SU2
CRM Cruise Drag Prediction and Grid Convergence
Workflow 3: GMAT + SPICE Toolkit
Earth-Moon Multi-Impulse Transfer Design
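The multi-impulse transfer design above ultimately reduces to budgeting delta-v against the vis-viva equation. A minimal patched-conic sketch of the first burn (illustrative orbit radii, not a GMAT run with full force models):

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth gravitational parameter

def vis_viva(r_km, a_km):
    """Orbital speed at radius r on an orbit with semi-major axis a."""
    return math.sqrt(MU_EARTH * (2.0 / r_km - 1.0 / a_km))

def transfer_dv(r_park_km, r_target_km):
    """Delta-v for the first burn of a two-impulse transfer ellipse."""
    a_transfer = 0.5 * (r_park_km + r_target_km)
    v_circular = vis_viva(r_park_km, r_park_km)
    v_perigee = vis_viva(r_park_km, a_transfer)
    return v_perigee - v_circular

# Illustrative: 300 km LEO parking orbit to lunar distance
dv1 = transfer_dv(6678.0, 384400.0)
print(f"trans-lunar injection burn: {dv1:.3f} km/s")
```

A production tool integrates third-body and perturbation forces; this only checks that a proposed maneuver budget has the right order of magnitude.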

Architecture & Construction

Architecture & Construction is the digital planning, coordination, analysis, and delivery of buildings and built assets across design firms, engineering consultancies, contractors, specialist trades, and owner-side delivery teams. In professional-practice terms, the field spans architectural design and documentation in Revit; multidisciplinary design coordination and constructability review in Navisworks Manage; BIM management, model quality assurance, and information delivery in Solibri Office and BIMcollab; building performance and sustainability analysis in OpenStudio and EnergyPlus; and adjacent quantity and handover work, often reconciled through Revit schedules and Solibri rule sets, that turns design models into procurement, commissioning, and operational data. Experts usually self-locate by the discipline boundary they manage—authoring, coordination, assurance, performance, quantification, or handover—rather than by a single linear BIM pipeline, and both operator teams and advisory specialists work against the same exchange standards, issue logs, and model deliverables. Across all these areas, the hard part is keeping geometry, object identity, classifications, quantities, required attributes, issue states, and simulation assumptions consistent across multiple tools and file exchanges. Small errors in any one layer—space closure, host relationships, classification mapping, or systems metadata—propagate into coordination misses, incorrect takeoffs, failed handover checks, and misleading performance results.

Workflow 1: Revit + Autodesk IFC Exporter
Author an Architectural IFC Model
Workflow 2: Navisworks Manage + BIMcollab Zoom
Run Federated Clash Detection and BCF Issue Packaging
Workflow 3: OpenStudio + EnergyPlus
Build a Whole-Building Energy Simulation
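At its core, the clash detection in Workflow 2 is an intersection test between element geometries. A toy axis-aligned bounding-box version (hypothetical element envelopes, not real IFC geometry) sketches the idea, including the tolerance setting used to suppress near-touching elements:

```python
def boxes_clash(a, b, tolerance=0.0):
    """True if two axis-aligned boxes ((min_xyz), (max_xyz)) overlap.

    A positive tolerance shrinks each box, ignoring near-touching
    elements the way a clash test's tolerance setting would.
    """
    (amin, amax), (bmin, bmax) = a, b
    return all(
        amin[i] + tolerance < bmax[i] and bmin[i] + tolerance < amax[i]
        for i in range(3)
    )

# Hypothetical duct vs. beam envelopes, coordinates in meters
duct = ((0.0, 0.0, 2.5), (4.0, 0.4, 2.9))
beam = ((3.8, -1.0, 2.7), (4.2, 1.0, 3.1))
print(boxes_clash(duct, beam))        # overlapping envelopes
print(boxes_clash(duct, beam, 0.25))  # suppressed under a 250 mm tolerance
```

Real clash engines test the actual solids, not their bounding boxes, but the box test is the standard broad-phase filter.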

Chemical Engineering

Chemical Engineering is the discipline of designing, scaling, integrating, and improving chemical, energy, and materials processes across laboratory, pilot, and operating-plant settings. Experts typically self-locate in thermodynamics and property-method work, where Aspen Properties anchors phase-equilibrium and physical-property decisions; reaction engineering and scale-up, where Cantera anchors kinetic model development and reactor prediction; separations and purification, where ChemSep anchors rigorous distillation and absorption work; plantwide design, debottlenecking, and operating-point support, where Aspen Plus anchors recycle closure, utility balance, and integrated flowsheets; and optimization, techno-economics, and uncertainty analysis, where IDAES anchors equation-oriented optimization and costed decision support. The same field spans owner-operator work inside manufacturing sites and advisory, licensor, or design-package work supporting new process development, retrofits, and expansion studies. Across all these areas, technical difficulty comes from tightly coupled thermodynamics, kinetics, recycle structure, equipment limits, utility costs, and product specifications, so small errors in data treatment or model choice propagate directly into purity, energy, safety, and economic decisions.

Workflow 1: DWSIM + COCO
Regress a Nonideal Property Package for a Separation System
Workflow 2: Cantera + DWSIM
Fit Reactor Kinetics and Predict New Operating Conditions
Workflow 3: ChemSep + DWSIM
Design a Rigorous Distillation Column from a Fixed Thermodynamic Basis
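Rigorous column design of the kind in Workflow 3 is usually bracketed first by shortcut relations. The Fenske equation for minimum theoretical stages at total reflux, with illustrative purity and volatility numbers, is easy to sanity-check:

```python
import math

def fenske_min_stages(x_dist, x_bot, alpha):
    """Fenske minimum theoretical stages at total reflux for a binary split,
    given light-key mole fractions in distillate and bottoms and a constant
    relative volatility alpha."""
    sep_factor = (x_dist / (1 - x_dist)) * ((1 - x_bot) / x_bot)
    return math.log(sep_factor) / math.log(alpha)

# Illustrative: 95% light key overhead, 5% in bottoms, alpha = 2.5
n_min = fenske_min_stages(0.95, 0.05, 2.5)
print(f"minimum stages: {n_min:.1f}")
```

The rigorous stage-by-stage solution then refines this estimate under finite reflux and real thermodynamics.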

Civil & Structural Engineering

Civil & Structural Engineering is the professional practice of analyzing, designing, checking, and rehabilitating buildings, foundations, soil systems, and other load-bearing assets across consulting design, specialist review, and owner-side technical advisory work. Senior practitioners usually self-locate in building-structure design and serviceability checking, often centered on ETABS; code-based seismic design and independent review, often run in SAP2000; performance-based seismic assessment and retrofit, typically executed in Perform3D or OpenSees; site-response and ground-motion development, often carried out in RSSeismic; and geotechnical, excavation, and soil-structure interaction work, commonly modeled in PLAXIS 2D/3D. Adjacent practice areas include construction-stage engineering, forensic investigation, and rehabilitation, which frequently reuse the same structural or geotechnical solvers under different contractual, evidentiary, and regulatory settings. Across all these areas, the hard part is not any single equation but the coupling of geometry, stiffness reduction, load combinations, mass source, constitutive behavior, groundwater, and code-defined demand, so small modeling errors propagate directly into drift, force, deformation, pore-pressure, and safety decisions.

Workflow 1: ETABS + buildingSMART Validation Service
Multi-Story Building Gravity and Wind Analysis
Workflow 2: SAP2000 + ASCE Hazard Tool
Response-Spectrum Steel Frame Seismic Design
Workflow 3: Perform3D + PEER Ground Motion Database
Nonlinear Time-History Performance Assessment
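The response-spectrum design in Workflow 2 rests on single-degree-of-freedom mechanics: period from mass and stiffness, spectral acceleration from the design spectrum, base shear from both. A schematic version with a flat-plateau/1-over-T spectrum (illustrative parameters, not a code-complete ASCE 7 calculation):

```python
import math

def natural_period(mass_kg, stiffness_n_per_m):
    """Fundamental period T = 2*pi*sqrt(m/k) of an SDOF oscillator."""
    return 2.0 * math.pi * math.sqrt(mass_kg / stiffness_n_per_m)

def design_sa(period_s, sds=1.0, sd1=0.6):
    """Schematic design spectrum in units of g: flat plateau at SDS,
    then decaying as SD1/T (illustrative spectral parameters)."""
    return min(sds, sd1 / period_s)

mass = 200_000.0   # kg, illustrative effective seismic mass
stiffness = 8.0e7  # N/m, illustrative lateral stiffness
T = natural_period(mass, stiffness)
base_shear_n = design_sa(T) * mass * 9.81
print(f"T = {T:.2f} s, elastic base shear = {base_shear_n / 1e3:.0f} kN")
```

A real design then divides this elastic demand by the response-modification factor and checks drift, which is where the solver-based workflows take over.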

Electronics Engineering

Electronics engineering is the professional practice of designing, validating, and releasing board-level electronic systems from schematic intent through manufacturable hardware and verified interconnect behavior. Senior practitioners usually self-locate by practice area: component library and model qualification using Ultra Librarian; board and subsystem design plus physical implementation in KiCad or Altium Designer; power conversion and control-loop design in PSpice; precision analog and mixed-signal front ends in LTspice; high-speed interconnect and signal-integrity signoff in HyperLynx; and fabrication release plus electrical-test preparation from CAD outputs and IPC-D-356 data. The field spans both operator teams inside OEMs, ODMs, and semiconductor vendors and specialist consultants who review layouts, power stages, analog chains, or channel compliance for external clients. Across these areas, the hard part is not any single schematic, model, or layout decision, but the coupling among netlists, package mappings, stackups, parasitics, solver assumptions, and manufacturing constraints: an error in one layer often invalidates timing, noise, stability, or testability results in another. Parser-valid models, constraint-clean layout, and release-clean fabrication data are tightly linked, so correctness cannot be judged from one artifact in isolation.

Workflow 1: KiCad + Ultra Librarian
Constraint-Driven Multilayer PCB Layout
Workflow 2: PSpice + PSpice for TI
Switching Power Loop Compensation and Transient Validation
Workflow 3: LTspice + Qucs-S
Precision Analog Front-End Noise and Tolerance Signoff
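Noise signoff for a precision analog front end, as in Workflow 3, starts from the Johnson-Nyquist thermal noise of the resistive elements; a quick back-of-envelope budget (illustrative resistance and bandwidth) looks like:

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def thermal_noise_vrms(resistance_ohm, bandwidth_hz, temp_k=300.0):
    """RMS thermal (Johnson-Nyquist) noise voltage of a resistor:
    v = sqrt(4 k T R B)."""
    return math.sqrt(4.0 * K_BOLTZMANN * temp_k * resistance_ohm * bandwidth_hz)

# Illustrative: 1 kohm source resistance over a 10 kHz measurement bandwidth
v_noise = thermal_noise_vrms(1e3, 10e3)
print(f"thermal noise: {v_noise * 1e6:.3f} uV rms")
```

Simulator-based signoff adds amplifier voltage/current noise and the transfer function shaping; the resistor floor above is the term no topology change can remove.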

Mechanical Engineering

Mechanical engineering is the discipline that converts geometry, materials, loads, thermal environments, and fluid conditions into defensible design decisions for products, equipment, and infrastructure across OEMs, suppliers, operators, test labs, and specialist CAE consultancies. Senior practice areas usually separate into structural integrity and durability, where Ansys Mechanical anchors stress, displacement, and release work; thermal management and thermo-mechanical reliability, often run in COMSOL Multiphysics; external aerodynamics and internal flow engineering, where Simcenter STAR-CCM+ and OpenFOAM support lift, drag, pressure-drop, separation, and heat-transfer studies; design optimization and multidisciplinary trade studies, where Dakota formalizes parameter sweeps and constrained search; and materials-property governance and simulation release support, often organized around Ansys Granta. In-house analysts and advisory teams move between these regions, but they remain distinct professional homes with different evidence standards and deliverables. Across all these areas, fidelity depends on the same tightly coupled layers: geometry cleanup, material cards, boundary conditions, mesh strategy, solver settings, and verification against reference data or conservation checks. Errors in units, wall treatment, contact definitions, or probe selection propagate quickly because model credibility is determined by the interaction between numerical setup, physical assumptions, and validation evidence.

Workflow 1: SALOME + Code_Aster
Linear Static Strength Analysis with Mesh Convergence
Workflow 2: SALOME + Elmer
Transient Thermal Analysis with Thermal Stress Mapping
Workflow 3: Gmsh + SU2
External Aerodynamics CFD Validation
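The mesh-convergence step in Workflow 1 is commonly reported through Richardson extrapolation: three solutions on systematically refined grids give an observed order of accuracy and an extrapolated grid-independent value. A minimal sketch with made-up monotonically converging values:

```python
import math

def richardson(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p and Richardson-extrapolated value
    from three solutions on grids with constant refinement ratio r."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    return p, f_exact

# Illustrative values for a monitored quantity (e.g. a drag coefficient)
p, f_exact = richardson(0.0320, 0.0290, 0.02825)
print(f"observed order = {p:.2f}, extrapolated value = {f_exact:.5f}")
```

The observed order should sit near the solver's formal order; a large mismatch usually signals the grids are not yet in the asymptotic range.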

Mining & Geological Engineering

Mining & Geological Engineering is the professional practice of converting incomplete subsurface evidence into defensible resource, reserve, design, and operating decisions for mining companies, consultants, laboratories, and technical signatories. Experts usually self-locate in exploration and geological data management, where Micromine Origin and QGIS govern drillhole, assay, survey, and terrain control; geological interpretation and 3D modelling, typically in Leapfrog Geo; resource estimation and technical reporting, often centered on Datamine Studio RM under JORC, CIM, and CRIRSCO reporting boundaries; strategic mine design and economic optimization, where GEOVIA Whittle anchors pit-shell and phase work; reserve conversion and life-of-mine scheduling, commonly in Minemax Scheduler; and adjacent geotechnical, hydrogeological, and environmental boundary-setting, which often returns to QGIS-linked spatial constraint models. The field spans both operator and advisory settings: mine technical services teams, consultants, and competent persons work on the same models but under different assurance and disclosure obligations. Across all these areas, geology, survey geometry, geostatistics, slope and recovery assumptions, and reporting-code definitions are tightly coupled, so errors in any one layer propagate directly into tonnes, grade, cash flow, and reportable reserve boundaries.

Workflow 1: Micromine Origin + QGIS | commercial + open source
Drillhole Data Intake, Standardization, and QA/QC
Workflow 2: Leapfrog Geo + QGIS | commercial + open source
Geological Domain Modelling from Drillholes
Workflow 3: Datamine Studio RM + PyGSLIB | commercial + open source
Ordinary Kriging Resource Estimation and Classification
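Workflow 3 uses ordinary kriging; a much simpler cousin, inverse-distance weighting, shows the basic spatial-weighting idea in a few lines (hypothetical composite coordinates and grades, and explicitly not a kriging implementation, which would also solve for weights from a variogram model):

```python
import math

def idw_estimate(samples, target, power=2.0):
    """Inverse-distance-weighted grade at `target` from (x, y, grade) samples."""
    num = den = 0.0
    for x, y, grade in samples:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return grade  # estimate at a sample point is the sample itself
        w = 1.0 / d**power
        num += w * grade
        den += w
    return num / den

# Hypothetical drillhole composites: (easting, northing, grade in g/t Au)
composites = [(0, 0, 1.2), (10, 0, 0.8), (0, 10, 2.0), (10, 10, 1.0)]
print(f"block grade estimate: {idw_estimate(composites, (5, 5)):.2f} g/t")
```

Kriging replaces the fixed distance weights with variogram-derived weights that account for sample redundancy and anisotropy, which is why it is the reportable method under JORC/CIM practice.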

Power & Energy Engineering

Power & Energy Engineering is the engineering and analytical practice of designing, validating, operating, protecting, and expanding electric power systems across utilities, grid operators, project developers, OEMs, and specialist consultancies. Recognizable practice areas include transmission planning and security assessment, typically executed in PSS®E or PowerWorld; distribution planning and DER integration, commonly centered on OpenDSS or CYME; protection and fault studies, often run in PowerFactory under IEC 60909 or in North American duty frameworks aligned to ANSI C37; interconnection and renewable integration studies, where PSS®E and pandapower are used to test POI performance, curtailment, and mitigation options; and model governance and case management, where engineers maintain RAW, DSS, and structured tabular case data so studies remain reproducible across teams. The field spans both operator-side work inside utilities and ISOs/RTOs and advisory-side work performed for developers, lenders, and asset owners. Across all these areas, the same model backbone is stressed under different criteria, so errors in topology, ratings, control settings, time-series alignment, or study assumptions propagate directly into voltage, thermal, fault, and interconnection conclusions. The difficulty is intrinsic because network physics, software-specific data models, and regulatory or standards-based acceptance rules are tightly coupled rather than independently checkable.

Workflow 1: PSS®E + pandapower
Transmission AC Load Flow and N-1 Security Assessment
Workflow 2: PowerFactory + OpenDSS
Distribution Short-Circuit and Device Duty Check
Workflow 3: OpenDSS + CYME
Distribution PV Hosting Capacity by Node

Robotics & Autonomous Systems

Robotics & Autonomous Systems is the engineering and operational field concerned with building, integrating, validating, and deploying machines that perceive, decide, and act in physical environments across industrial robotics, mobile robotics, drones, and autonomous vehicles. In practice, experts usually sit in recognizable regions rather than a single end-to-end pipeline: systems integration and middleware engineering around ROS 2; calibration and sensor-geometry work using Kalibr; state estimation and localization using OpenVINS; LiDAR- or vision-based mapping and SLAM using LIO-SAM; manipulation and constrained motion planning using MoveIt; and closed-loop autonomy validation, benchmarking, and release support in environments such as CARLA. The field also includes operator-side deployment and debugging as well as advisory, benchmarking, and safety-assurance functions for teams selecting architectures, datasets, and validation regimes. Across all these areas, correctness depends on tightly coupled time bases, coordinate frames, robot models, sensor noise assumptions, and environment definitions; small errors in any one layer propagate into trajectory error, collision risk, or invalid performance claims. The field is difficult not because any single algorithm is obscure, but because estimation, planning, simulation, and evaluation each rely on upstream configuration choices that must remain consistent across software, hardware, and scenario definitions.

Workflow 1: Kalibr + ROS camera_calibration
Camera-IMU Calibration for Visual-Inertial Systems
Workflow 2: OpenVINS + evo
Stereo-IMU Visual-Inertial Odometry on EuRoC
Workflow 3: LIO-SAM + CloudCompare
LiDAR-IMU SLAM and Map Comparison on Oxford Spires
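Evaluation tools like evo summarize trajectory accuracy as absolute trajectory error. The core metric, RMSE of pointwise position error against ground truth, is simple once trajectories are associated by timestamp and expressed in a common frame (the alignment step, which evo also performs, is assumed done here):

```python
import math

def ate_rmse(estimated, ground_truth):
    """Absolute trajectory error (RMSE) between matched 3D positions.

    Assumes the trajectories are already time-associated and expressed
    in the same frame; production tools estimate an alignment first.
    """
    assert len(estimated) == len(ground_truth)
    sq_err = [
        sum((e - g) ** 2 for e, g in zip(p_est, p_gt))
        for p_est, p_gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(sq_err) / len(sq_err))

# Toy trajectory: the estimate carries a constant 10 cm offset in x
gt = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.1, 0.0, 0.0), (1.1, 0.0, 0.0), (2.1, 0.0, 0.0)]
print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")
```

This is why the coupled layers in the paragraph above matter: a wrong time base or frame convention inflates ATE even when the estimator itself is sound.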

Semiconductor Design

Semiconductor design is the practice of translating digital hardware intent into manufacturable, signoff-ready integrated-circuit layouts across product companies, internal CAD organizations, and external ASIC design-service teams. Experts typically locate themselves in logic synthesis and library mapping, where Yosys anchors RTL-to-gates work; physical implementation and routability closure, where OpenROAD anchors floorplanning, placement, CTS, and routing; timing signoff and ECO, where OpenSTA anchors multi-corner slack and electrical closure; physical rule verification and tapeout signoff, where KLayout anchors batch DRC review; and hierarchical integration plus flow enablement, where OpenLane2 anchors hard-macro assembly and packaged reference runs on SKY130 or GF180MCU. Adjacent but recognized work includes benchmark curation and platform qualification for open shuttle and MPW-ready reference runs. Across these regions, correctness depends on keeping RTL, constraints, Liberty and LEF views, extracted parasitics, rule decks, and macro abstractions version-aligned; a mismatch in any one view can invalidate timing, legality, or circuit equivalence at signoff.

Workflow 1: Yosys + ABC
Logic Synthesis and Library Mapping
Workflow 2: OpenROAD + OpenLane2
Physical Implementation and Routability Closure
Workflow 3: OpenSTA + OpenROAD Resizer
Multi-Corner Timing Signoff and ECO
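The multi-corner signoff in Workflow 3 boils down to computing slack per path per corner: required time minus arrival time. A toy static-timing sketch (hypothetical path delays in nanoseconds, not an OpenSTA run):

```python
def setup_slack(clock_period_ns, path):
    """Setup slack for one register-to-register path: required minus
    arrival, given launch skew, clock-to-Q, combinational delay,
    setup time, and capture skew (all hypothetical, in ns)."""
    arrival = path["launch_skew"] + path["clk_to_q"] + path["comb_delay"]
    required = clock_period_ns + path["capture_skew"] - path["setup"]
    return required - arrival

corners = {
    "slow":    {"launch_skew": 0.05, "clk_to_q": 0.12, "comb_delay": 1.60,
                "setup": 0.08, "capture_skew": 0.03},
    "typical": {"launch_skew": 0.04, "clk_to_q": 0.09, "comb_delay": 1.10,
                "setup": 0.06, "capture_skew": 0.03},
}
for name, path in corners.items():
    print(f"{name}: slack = {setup_slack(2.0, path):+.2f} ns")
```

Signoff requires non-negative slack at every corner simultaneously, which is why the Liberty, parasitic, and constraint views named above must stay version-aligned: each corner reads a different set of them.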
🔬

Physical Sciences

Astronomy & Astrophysics

Astronomy & Astrophysics is the field of converting observatory and simulation outputs into defensible physical inferences about stars, galaxies, compact objects, and large-scale structure across university groups, observatory pipeline teams, survey collaborations, archives, and theory programs. The major practice areas are instrument-level imaging reduction and mosaic production anchored in AstroDrizzle; stellar spectroscopy and abundance analysis anchored in iSpec; high-energy source reprocessing and spectral modeling anchored in CIAO; survey catalog calibration, crossmatch, and data-release work anchored in TOPCAT; cosmology and structure-formation modeling anchored in GADGET-4; and archive and calibration operations anchored in CRDS. Experts usually self-locate by data regime and inference style rather than by a single pipeline stage: observatory operators and archive teams own calibration and provenance, while PI-led analysis groups and survey teams own extraction, modeling, and publication-grade validation. The same field also includes adjacent reference-data work around external astrometric, photometric, and line-list standards. Across all these areas, results depend on tightly coupled detector calibration, coordinate systems, response models, external reference catalogs, and physical forward models, so small errors in one layer propagate into source properties, cross-survey consistency, and downstream scientific claims. The hard part is not any single transform but preserving physical traceability from raw files and calibration context through to the reported parameters and uncertainty budget.

Workflow 1: calacs + AstroDrizzle
HST ACS Imaging Reduction and Calibrated Source Catalog
Workflow 2: EsoReflex + iSpec
UVES Echelle Reduction and Stellar Parameter Inference
Workflow 3: CIAO + XSPEC
Chandra ACIS Reprocessing and Point-Source Spectral Fit
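The calibrated source catalog in Workflow 1 ends with photometric calibration: converting background-subtracted counts to magnitudes through a zero point. The final arithmetic step, with illustrative numbers, is:

```python
import math

def calibrated_magnitude(counts, exposure_s, zero_point):
    """Magnitude from background-subtracted source counts:
    m = -2.5 * log10(counts / exposure) + zero point."""
    return -2.5 * math.log10(counts / exposure_s) + zero_point

# Illustrative: 50,000 counts in a 500 s exposure with a 25.0 mag zero point
m = calibrated_magnitude(5.0e4, 500.0, 25.0)
print(f"calibrated magnitude: {m:.2f}")
```

The zero point itself is what the calibration context (and ultimately the archive's reference files) supplies, which is why provenance errors upstream surface directly as photometric offsets downstream.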

Computational Chemistry & Materials Science

Computational chemistry and materials science is the quantitative use of atomistic and electronic-structure computation to evaluate molecules, biomolecular complexes, crystals, surfaces, and derived properties across pharmaceutical, chemical, energy, semiconductor, and specialist advisory settings. Experts typically self-locate in molecular quantum chemistry, where ORCA anchors geometry optimization, frequencies, and frontier-orbital analysis; structure-based molecular design, where AutoDock Vina anchors pose generation and virtual screening against protein binding sites; biomolecular simulation, where GROMACS anchors explicit-solvent stability and binding analyses; periodic materials and surface modeling, where Quantum ESPRESSO anchors crystal relaxation, equations of state, electronic structure, and adsorption studies; and computational operations and data stewardship, where AiiDA anchors high-throughput execution, provenance, and reproducibility for internal R&D groups, CROs, and scientific consulting teams. Advisory variants often sit beside operator teams, translating method choice, benchmark evidence, and computational risk into project recommendations or go/no-go decisions. Across all these regions, the hard part is not any single calculation but the coupling between structure preparation, model form, numerical settings, and reference data: protonation, basis sets or pseudopotentials, force-field choices, convergence criteria, and benchmark selection all change the answer, and a silent error in one layer propagates directly into downstream ranking, stability, or property claims.

Workflow 1: ORCA + RDKit
Small-Molecule Conformer Search, HF Preoptimization, and DFT Verification
Workflow 2: AutoDock Vina + GNINA
Batch Redocking and Pose Recovery for a Single Target Family
Workflow 3: GROMACS + gmx_MMPBSA
Explicit-Solvent Protein-Ligand MD with MM/PBSA Readout
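The conformer search in Workflow 1 ends with ranking by relative energy, and Boltzmann populations at a given temperature make that ranking quantitative. A minimal sketch with illustrative relative energies:

```python
import math

GAS_CONSTANT = 8.314462618e-3  # kJ/(mol K)

def boltzmann_populations(rel_energies_kj_mol, temp_k=298.15):
    """Fractional conformer populations from relative energies."""
    weights = [math.exp(-e / (GAS_CONSTANT * temp_k)) for e in rel_energies_kj_mol]
    total = sum(weights)
    return [w / total for w in weights]

# Illustrative DFT relative energies for three conformers (kJ/mol)
pops = boltzmann_populations([0.0, 2.5, 6.0])
for i, p in enumerate(pops):
    print(f"conformer {i}: {100 * p:.1f}%")
```

Because the weights are exponential in energy, the level-of-theory and convergence choices named above shift populations sharply: an error of a few kJ/mol can swap which conformer dominates.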

Earth & Atmospheric Science

Earth & Atmospheric Science is the professional practice of turning atmospheric, oceanic, cryospheric, and solid-Earth observations into forecasts, event characterizations, hindcasts, and climate diagnostics for national weather services, seismic networks, research institutions, utilities, and hazard-advisory organizations. Recognizable practice areas include operational and research weather prediction, typically built around WRF; seismic network operations and catalog production, commonly centered on SeisComP; earthquake source physics and inversion, anchored in Grond; coastal and regional ocean prediction, built on ROMS; global climate experiment design and intercomparison, managed through CESM; and forecast or model evaluation, commonly executed with METplus or ESMValTool against standard observation products. The field includes both operator roles that deliver time-critical outputs and advisory or research roles that interpret model skill, event uncertainty, and physical attribution for downstream users. Across these areas, the hard part is maintaining consistency among grids, calendars, boundary conditions, station metadata, velocity models, observation operators, and reference truth sets; small errors in any one layer propagate into biased forecasts, mislocated events, unstable integrations, or misleading performance scores because the data model and the governing physics are tightly coupled.

Workflow 1: WRF + METplus
Regional Heavy-Precipitation Hindcast and Verification
Workflow 2: SeisComP + NonLinLoc
Continuous-Waveform Earthquake Catalog Production
Workflow 3: Grond + Pyrocko
Regional Moment-Tensor Inversion
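For dichotomous events like precipitation exceeding a threshold, the verification half of Workflow 1 reduces to a 2x2 contingency table, from which the standard categorical scores follow directly (illustrative counts):

```python
def categorical_scores(hits, misses, false_alarms):
    """POD, FAR, and CSI from a 2x2 forecast contingency table."""
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false-alarm ratio
    csi = hits / (hits + misses + false_alarms) # critical success index
    return pod, far, csi

# Illustrative verification counts against a heavy-precipitation threshold
pod, far, csi = categorical_scores(hits=42, misses=18, false_alarms=10)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
```

The scores are only as good as the matching behind the counts, which is where the grid, calendar, and observation-operator consistency issues above bite: a misaligned truth set silently shifts events between cells of the table.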

🧬

Life Sciences

Computational Drug Discovery

Computational drug discovery is the model-, structure-, and simulation-driven practice of finding, ranking, and optimizing therapeutic molecules across pharma, biotech, CRO, and platform organizations. In professional terms, the field splits into target and structure enablement, where PDBFixer converts experimental or predicted proteins into campaign-ready receptors; structure-based hit discovery, where Glide supports docking, enrichment testing, and pose review; ligand-based design and scaffold hopping, where Phase abstracts common interaction geometry from known actives; lead optimization and affinity prediction, where FEP+ informs congeneric series ranking before synthesis; DMPK/ADMET and multiparameter optimization, where ADMET Predictor translates chemistry into developability risk; and discovery informatics, where KNIME operationalizes standardization, assay joins, and auditable batch workflows for embedded project teams and centralized advisory groups. Across these regions, decisions are tightly coupled to protonation and tautomer handling, retained waters and cofactors, assay provenance, temporal split design, and project-specific gating rules, so small representation errors propagate directly into docking ranks, model metrics, synthesis priorities, and no-go calls.

Workflow 1: PDBFixer + fpocket
Receptor Preparation and Pocket Definition
Workflow 2: AutoDock Vina + gnina
Structure-Based Virtual Screening
Workflow 3: RDKit + ZINCPharmer
Ligand-Based Scaffold Hopping
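Virtual screens like Workflow 2 are judged by early enrichment: how many known actives land in the top fraction of the ranked list. The enrichment-factor arithmetic on a toy ranked screen (hypothetical active/decoy labels):

```python
def enrichment_factor(ranked_labels, fraction=0.01):
    """EF at a given fraction: hit rate in the top of the ranked list
    divided by the hit rate over the whole screen.

    `ranked_labels` is 1 for an active, 0 for a decoy, best score first.
    """
    n_top = max(1, int(len(ranked_labels) * fraction))
    top_rate = sum(ranked_labels[:n_top]) / n_top
    overall_rate = sum(ranked_labels) / len(ranked_labels)
    return top_rate / overall_rate

# Toy screen: 1,000 compounds, 20 actives, 5 of them ranked in the top 1%
ranked = [1] * 5 + [0] * 5 + [1] * 15 + [0] * 975
ef1 = enrichment_factor(ranked, 0.01)
print(f"EF@1% = {ef1:.0f}")
```

An EF@1% of 25 means the top percentile is 25 times richer in actives than random picking; assay provenance and split design determine whether that number is real or an artifact of label leakage.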

Genomics & Bioinformatics

Genomics & Bioinformatics is the professional practice of converting sequencing and phenotype data into analytical, clinical, public-health, and translational decisions for clinical laboratories, public-health agencies, biopharma programs, research cores, and specialist reference-lab teams. Major practice areas include human germline and population analysis, typically run in production on DRAGEN; somatic oncology and molecular profiling, typically anchored on Mutect2; long-read structural-variant resolution in difficult loci, often centered on Sniffles2; infectious-disease and public-health genomics, where Nextclade supports lineage and QC decisions; and rare-disease/clinical interpretation, where Exomiser drives phenotype-based ranking and VarSome Clinical supports evidence-managed review and sign-out. These practices span both operator and advisory variants, from pipeline ownership and assay support to expert interpretation and laboratory-facing review. Across all of them, validity depends on tight coupling among reference build, assay design, alignment and annotation versions, truth regions, phenotype encoding, and interpretation frameworks. A mismatch in any one layer can leave BAM, VCF, FASTA, or TSV outputs superficially valid while breaking comparability, benchmark scoring, or clinical defensibility.

Workflow 1: BWA-MEM2 + GATK
Germline Short-Variant Calling with VQSR
Workflow 2: minimap2 + Sniffles2
Long-Read Structural-Variant Detection in Medically Relevant Genes
Workflow 3: BWA-MEM2 + Mutect2
Tumor-Normal Somatic Small-Variant Calling
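Benchmark scoring for callers like these compares a call set against a truth set. At its simplest, using exact-match variant keys and ignoring the genotype- and region-aware logic of dedicated benchmarking tools, precision and recall are set arithmetic:

```python
def benchmark_calls(called, truth):
    """Precision, recall, and F1 for variant calls against a truth set.
    Variants are keyed as (chrom, pos, ref, alt) tuples."""
    called, truth = set(called), set(truth)
    tp = len(called & truth)
    precision = tp / len(called)
    recall = tp / len(truth)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical call and truth sets
truth = {("chr1", 1000, "A", "G"), ("chr1", 2000, "C", "T"), ("chr2", 500, "G", "A")}
called = {("chr1", 1000, "A", "G"), ("chr2", 500, "G", "A"), ("chr2", 900, "T", "C")}
p, r, f1 = benchmark_calls(called, truth)
print(f"precision={p:.2f} recall={r:.2f} F1={f1:.2f}")
```

The exact-match key is also where the comparability failures described above enter: a reference-build or normalization mismatch makes identical variants key differently, deflating both metrics without any caller error.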

Structural Biology & Protein Engineering

Structural Biology & Protein Engineering is the professional practice of determining, predicting, redesigning, and validating protein structures and assemblies for discovery, platform, and therapeutic use across academic labs, biotechs, pharma, CROs, and enabling software teams. Experts usually self-locate in predictive structure modeling, where ColabFold anchors sequence-to-structure inference; complex and assembly modeling, where HADDOCK supports multichain and restraint-guided docking; protein design and redesign, where Rosetta and ProteinMPNN drive fixed-backbone, interface, and sequence optimization; molecular simulation and conformational analysis, where GROMACS is used for all-atom dynamics and folding studies; and validation, benchmark, and release-quality review, where MolProbity is used to audit stereochemistry and structural plausibility. Adjacent practice includes antibody and enzyme engineering, experimental-constraint interpretation, and industrial deployment inside suites such as Schrödinger BioLuminate. Across these regions, difficulty comes from tightly coupled biological priors, database depth, chain mapping, coordinate representations, force-field assumptions, and evaluation metrics: choices made in MSA depth, template cutoff, restraints, protonation, or assembly definition propagate directly into confidence scores, interface quality, and downstream design decisions.

Workflow 1: ColabFold + MolProbity
Monomer Structure Prediction and Geometry Audit
Workflow 2: ColabFold + DockQ
Multimer Assembly Prediction and Interface Scoring
Workflow 3: ProteinMPNN + OpenFold
Fixed-Backbone Sequence Design with Refold Validation
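Refold validation in Workflow 3 compares the designed sequence's predicted structure back against the target backbone. The simplest comparison is coordinate RMSD over matched atoms, assuming the structures are already superposed (production pipelines first apply a Kabsch alignment):

```python
import math

def rmsd(coords_a, coords_b):
    """RMSD over matched atom coordinates (angstroms), assuming the two
    structures are already superposed in a common frame."""
    assert len(coords_a) == len(coords_b)
    sq = [
        sum((a - b) ** 2 for a, b in zip(pa, pb))
        for pa, pb in zip(coords_a, coords_b)
    ]
    return math.sqrt(sum(sq) / len(sq))

# Toy CA traces: the refolded model sits 0.5 A off along x
backbone = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (7.6, 0.0, 0.0)]
refolded = [(0.5, 0.0, 0.0), (4.3, 0.0, 0.0), (8.1, 0.0, 0.0)]
print(f"CA RMSD: {rmsd(backbone, refolded):.2f} A")
```

RMSD is dominated by the worst regions, which is why chain mapping and residue correspondence, flagged above as failure points, must be exact before the number means anything.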
🏥

Health & Medicine

Biomedical Engineering

Biomedical Engineering is the professional practice of measuring, modeling, and validating human movement and load so that clinicians, researchers, device teams, and motion laboratories can convert raw sensing into defensible biomechanical decisions. Experts typically self-locate in one of five regions: clinical gait and rehabilitation analysis, commonly run in hospital or lab settings with Vicon Nexus; motion-laboratory data engineering and quality control, where BTK is used to audit C3D trials and force-plate assignments; markerless movement analysis, where OpenCap reconstructs 3D kinematics from synchronized video; wearable and ambulatory sensing, where Xsens Analyze turns IMU streams into gait events and joint trajectories outside the lab; and musculoskeletal and implant-load simulation, where OpenSim estimates joint moments, muscle forces, and contact loads. Work appears in both operator-heavy lab execution and advisory settings such as protocol design, device evaluation, and validation. Across all these areas, the hard part is not isolated computation but the coupling of calibration, synchronization, model scaling, coordinate conventions, and external-force mapping; small errors in any layer propagate into downstream joint angles, event timing, and internal-load estimates.

Workflow 1: Vicon Nexus + pyCGM2
Optical Clinical Gait Analysis
Workflow 2: OpenCap + OpenSim
Multi-Camera Markerless Gait Reconstruction
Workflow 3: OpenSim + CEINMS
EMG-Informed Knee Contact Force Estimation
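Gait pipelines like these detect foot strike and toe off from the vertical ground-reaction force by threshold crossing. A toy sketch on a synthetic force trace (hypothetical 20 N threshold and 1 kHz sampling rate):

```python
def detect_stance(vertical_force_n, threshold_n=20.0, rate_hz=1000.0):
    """Return (foot_strike_s, toe_off_s) from a vertical GRF trace by
    finding the first rising and the next falling threshold crossing."""
    strike = toe_off = None
    for i in range(1, len(vertical_force_n)):
        prev, cur = vertical_force_n[i - 1], vertical_force_n[i]
        if strike is None and prev < threshold_n <= cur:
            strike = i / rate_hz
        elif strike is not None and prev >= threshold_n > cur:
            toe_off = i / rate_hz
            break
    return strike, toe_off

# Synthetic trace in newtons: baseline, loading ramp, stance plateau, swing
trace = [0.0] * 100 + [50.0 * k for k in range(1, 11)] + [700.0] * 600 + [0.0] * 100
strike, toe_off = detect_stance(trace)
print(f"foot strike at {strike:.3f} s, toe off at {toe_off:.3f} s")
```

Event timing errors here cascade into every stance-normalized quantity downstream, which is why force-plate synchronization and plate-to-foot assignment get audited so carefully in the QC region above.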

Clinical Informatics

Clinical Informatics is the discipline that designs, governs, and operationalizes digital clinical workflows, data exchange, and computable rules across care delivery organizations, public-health programs, vendors, and specialist advisory teams. In practice, experts usually self-locate in regions such as EHR content and workflow design, where Epic or OpenEMR anchor templates, questionnaires, and order-entry logic; interoperability and interface engineering, where NextGen Connect carries HL7 v2 transaction design and routing; API and patient-access enablement, where HAPI FHIR supports US Core and SMART on FHIR services; terminology and quality/governance work, where Snowstorm and VSAC anchor coded concepts, value sets, and measure logic; and clinical decision support, where CQF Ruler operationalizes CQL and CDS Hooks. Adjacent practice areas include public-health reporting, implementation optimization, and vendor-side integration or advisory work built on the same platforms. Across all of these regions, the hard part is not any single standard or tool but the coupling between local workflow design, terminology versioning, interoperability constraints, authorization models, and regulatory conformance: a small defect in one layer can propagate into incorrect clinical data capture, failed exchange, misleading decision support, or certification failure.

Workflow 1: OpenEMR + LOINC FHIR Terminology Service
Configure a Structured Outpatient Screening Form
Workflow 2: NextGen Connect 4.5.2 + NIST HL7 v2 Conformance Tools
Build an HL7 v2 Immunization Reporting Interface
Workflow 3: HAPI FHIR + Inferno
Stand Up a US Core Patient-Access FHIR Service
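To make the FHIR side of this concrete, here is a minimal Patient resource in FHIR R4 shape plus a trivial presence check on the elements US Core makes mandatory (identifier, name, gender). The identifier system URL and values are hypothetical; real conformance testing is what Inferno and profile validation are for.

```python
import json

# Minimal Patient resource in FHIR R4 shape; the MRN namespace URL is illustrative.
patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "identifier": [{
        "system": "http://hospital.example.org/mrn",  # hypothetical namespace
        "value": "12345",
    }],
    "name": [{"family": "Chalmers", "given": ["Peter"]}],
    "gender": "male",
    "birthDate": "1974-12-25",
}

def missing_required(p):
    """Report which of the US Core Patient mandatory elements are absent."""
    return [f for f in ("identifier", "name", "gender") if not p.get(f)]

payload = json.dumps(patient)
problems = missing_required(patient)
```

This is only a structural smoke test; terminology bindings and Must Support rules still need a real validator.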

Epidemiology & Public Health

Epidemiology & Public Health is the practice of measuring disease occurrence, intervention effects, and population risk in order to guide surveillance, response, clinical evidence generation, and health-system decisions across health departments, hospital networks, CROs, sponsors, NGOs, and academic centers. Senior experts typically self-locate in public health surveillance and case reporting, where DHIS2 supports routine reporting and case-management operations; infectious disease modeling and short-horizon forecasting, where EpiNow2 is used for Rt estimation, nowcasting, and incident forecasts; clinical data standards and regulatory programming, where Pinnacle 21 anchors SDTM/ADaM conformance and submission QC; trial biostatistics and outcomes analysis, where SAS is used for survival, treatment-effect, and safety estimation; and spatial epidemiology, cluster detection, and disease mapping, where SaTScan and QGIS support hotspot detection, small-area risk estimation, and intervention targeting. Operator and advisory variants use the same methods but package them differently for ministries, delivery systems, sponsors, and consulting teams. Across all these areas, the hard part is the coupling of delayed reporting, censoring and time-at-risk definitions, geography and denominator alignment, and regulated metadata, so small errors in dates, population bases, or analytic cohorts propagate directly into forecasts, hazard estimates, cluster calls, and operational decisions.

Workflow 1: DHIS2 + Epi Info
Weekly Case-Line Harmonization and Signal Review Pack
Workflow 2: EpiNow2 + epinowcast
Nowcasting and 1–4 Week Hospitalization Forecast
Workflow 3: admiral + Pinnacle 21 Community
SDTM-to-ADaM Derivation and Submission QC
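The Rt estimation mentioned above rests on the renewal equation; the sketch below is a deliberately crude version with a fixed generation-interval distribution and no delay or right-truncation adjustment, which is exactly the machinery EpiNow2 adds. All numbers are illustrative.

```python
import numpy as np

def crude_rt(incidence, gen_interval):
    """Rt_t = I_t / sum_s w_s * I_{t-s}: cases today divided by the infection
    pressure from recent cases, weighted by the generation interval w."""
    w = np.asarray(gen_interval, float)
    w = w / w.sum()                  # normalise the interval distribution
    I = np.asarray(incidence, float)
    out = []
    for t in range(len(w), len(I)):
        pressure = float(np.dot(I[t - len(w):t][::-1], w))  # w[0] weights yesterday
        out.append(I[t] / pressure)
    return out

# Flat incidence should give Rt == 1 everywhere under this definition.
rts = crude_rt([100] * 10, [0.2, 0.5, 0.3])
```

Reporting delays are why this naive estimator misleads near the present: the most recent I_t values are incomplete, biasing Rt downward without nowcasting.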

Medical Imaging

Medical Imaging is the professional practice of turning clinical and research image data into registered, segmented, reconstructed, and quantitatively interpretable objects for diagnosis, treatment planning, device design, and scientific analysis. The field spans enterprise diagnostic post-processing and reporting, where radiology departments use Siemens syngo.via for advanced visualization and standardized measurement; radiation oncology and nuclear-medicine fusion work, where MIM Maestro supports multimodality registration, contouring, and propagation; surgical planning and device modeling, where Materialise Mimics converts reviewed segmentations into operative or manufacturing-grade anatomy models; neuroimaging morphometry and atlas-based research, where FreeSurfer underpins cortical and subcortical measurement; quantitative imaging core-lab analytics, where ANTs drives cross-modality registration and normalization; and imaging informatics and interoperability operations, where 3D Slicer is used to inspect, package, and exchange derived objects. These regions are practiced by hospital operator teams, academic labs, CRO-style imaging cores, and specialist advisors rather than by a single homogeneous function. Across all of them, the hard part is preserving voxel geometry, transform provenance, coded anatomy, and patient-linked object relationships across DICOM and research derivatives; small errors in spacing, orientation, or reference objects silently invalidate downstream segmentations, measurements, models, and decisions.

Workflow 1: FreeSurfer + Freeview | Open-source | Linux, macOS
Brain T1 MRI Morphometry and Atlas-Based Quantification
Workflow 2: TotalSegmentator + 3D Slicer | Open-source | Windows, macOS, Linux
Abdominal CT Organ Modeling for Surgical Planning
Workflow 3: ANTs + FSL FLIRT | Open-source | Linux, macOS
Multimodality Brain Registration Benchmark
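The "preserving voxel geometry" point can be shown in a few lines: given a NIfTI-style 4x4 affine, voxel spacing and axis directions fall out of the 3x3 direction block, and a silent sign flip is immediately visible. The affine values below are illustrative.

```python
import numpy as np

def spacing_and_axes(affine):
    """Voxel spacing = column norms of the 3x3 direction block; axes = unit columns."""
    R = np.asarray(affine, float)[:3, :3]
    spacing = np.linalg.norm(R, axis=0)
    axes = R / spacing
    return spacing, axes

# Illustrative affine: 0.5 x 0.5 x 2.0 mm voxels, axis-aligned, with a
# flipped x direction of the kind that RAS/LAS convention mismatches produce.
aff = np.array([
    [-0.5, 0.0, 0.0,   90.0],
    [ 0.0, 0.5, 0.0, -126.0],
    [ 0.0, 0.0, 2.0,  -72.0],
    [ 0.0, 0.0, 0.0,    1.0],
])
spacing, axes = spacing_and_axes(aff)
```

Comparing spacing and axis signs between a DICOM series and its research derivative is a cheap guard against exactly the silent geometry errors described above.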

Radiation Oncology

Radiation oncology is the clinical and technical discipline of using ionizing radiation to treat cancer across hospital-based practice, academic centers, vendor ecosystems, and specialist physics consulting. Its field map is organized less by a single pipeline than by recognized practice areas: physician-led disease-site practice and prescription design, dosimetry and treatment planning in Eclipse or Monaco, adaptive therapy and re-irradiation assessment in MIM Maestro, patient-specific QA and independent dose verification in SunCHECK Patient, machine commissioning and linac performance analysis with pylinac, and research, benchmarking, and method development in matRad and CERR. Experts typically self-locate by disease site, planning modality, QA and verification scope, adaptive or cumulative dose work, or commissioning and validation responsibility; advisory variants appear in vendor implementation, TPS commissioning, and independent physics review as well as direct clinical operations. Across all these areas, the hard part is that structure nomenclature, image geometry, beam model assumptions, dose engines, delivery constraints, and QA criteria are tightly coupled through DICOM-RT and professional guidance such as TG-263, TG-119, TG-218, and related AAPM practice standards. Errors in any one layer propagate quickly because the same RTSTRUCT, RTPLAN, and RTDOSE objects drive planning decisions, secondary checks, machine delivery, and retrospective review.

Workflow 1: matRad + CERR
Head-and-Neck IMRT Plan Generation
Workflow 2: matRad + CERR
Independent Secondary 3D Dose Recalculation
Workflow 3: matRad + PyMedPhys
TG-119 End-to-End Commissioning Validation
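To ground the QA-criteria coupling, here is the dose-difference half of a TG-218-style agreement check as a toy sketch; the distance-to-agreement half of a full gamma analysis is deliberately omitted, and the dose arrays are illustrative.

```python
import numpy as np

def global_dose_diff_pass(ref, evl, tol_pct=3.0):
    """Fraction of points whose |evl - ref| is within tol_pct percent of the
    reference maximum (a global-normalisation dose-difference criterion)."""
    ref = np.asarray(ref, float)
    evl = np.asarray(evl, float)
    tol = tol_pct / 100.0 * ref.max()
    return float(np.mean(np.abs(evl - ref) <= tol))

ref = np.array([10.0, 50.0, 100.0, 80.0])
evl = np.array([10.5, 50.0, 104.0, 80.0])  # one point 4 units off
rate = global_dose_diff_pass(ref, evl)     # tolerance is 3% of 100 = 3.0
```

The choice of global versus local normalisation changes which voxels fail, which is one reason TG-218 is explicit about the criterion used.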
🧠 Psychology & Neuroscience

Computational Neuroscience

Computational neuroscience is the computational analysis, modeling, and simulation of neural systems across cellular, circuit, and whole-brain scales in academic labs, research institutes, neurotechnology groups, and translational R&D settings. Senior practitioners usually self-locate in five practice areas: data standards and curation, where neurophysiology programs package experiments in NWB and MRI programs organize studies in BIDS; optical physiology analysis, where groups use Suite2p to extract cells, traces, and events from two-photon recordings; single-cell morphology and biophysical modeling, where NEURON is used to fit conductance-based cell models to patch-clamp and SWC data; circuit and population modeling, where NEST is used to calibrate spiking networks against electrophysiology; and connectomics and whole-brain simulation, where TVB is used to fit subject-specific structural and functional models. The field includes both operator-heavy execution in wet-lab and methods-core environments and advisory or platform work around reproducibility, benchmarking, and model reuse. Across all these areas, correctness depends on keeping metadata, timebases, morphology topology, unit QC, atlas labels, and matrix ordering synchronized across acquisition files, standard formats, and simulators. Small mismatches rarely fail loudly, but they propagate into segmentation scores, fitted parameters, tuning statistics, and functional-connectivity estimates.

Workflow 1: Suite2p + OASIS
Two-Photon Calcium Imaging Extraction and Response Quantification
Workflow 2: PyNWB + AllenSDK
Neurophysiology Session Standardization into NWB
Workflow 3: NEURON + BluePyOpt
Single-Cell Biophysical Model Fitting from Patch-Clamp and Morphology
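For the optical-physiology region, the core transform is ΔF/F. The sketch below uses a low-percentile baseline, one common convention among several; the trace and the percentile choice are illustrative, not Suite2p's internal pipeline.

```python
import numpy as np

def dff(trace, baseline_pct=20.0):
    """ΔF/F with a percentile baseline: (F - F0) / F0, where F0 is a low
    percentile of the raw fluorescence trace."""
    f = np.asarray(trace, float)
    f0 = np.percentile(f, baseline_pct)
    return (f - f0) / f0

# One transient over a flat baseline (arbitrary fluorescence units).
raw = np.array([100.0, 100.0, 100.0, 150.0, 100.0])
resp = dff(raw)
```

A miscalibrated baseline (for example, contaminated by neuropil) rescales every event amplitude downstream, which is how a quiet error in one layer distorts tuning statistics.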

Psychology & Neuroscience

Psychology & Neuroscience is the professional study and operational measurement of cognition, emotion, behavior, and brain function across academic labs, clinical research groups, imaging cores, assessment programs, and translational analytics teams. The field is typically segmented into cognitive and behavioral experimentation, where teams design and time tasks in PsychoPy; psychometrics and assessment validation, where measurement specialists fit latent-variable and IRT models in lavaan; neuroimaging acquisition and analysis, where imaging groups standardize and model task data through fMRIPrep; electrophysiology and ERP/MEG analysis, where signal-processing teams clean and quantify fast neural responses in EEGLAB; clinical and patient-reported outcomes programs organized around PROMIS instruments; and reproducibility-focused data curation, where shared datasets are structured and audited with BIDS Validator. Work spans both operators who run studies and advisory or methods specialists who set measurement, analysis, and reporting standards. Across all these areas, the hard part is the coupling of experimental design, item logic, acquisition metadata, artifact control, and inferential assumptions: timing drift, mis-keyed items, missing sidecars, or inconsistent event definitions propagate directly into effect estimates, scale scores, and cross-study comparability. The field depends on workflows that keep theory, data structure, and verification tightly aligned at each handoff.

Workflow 1: PsychoPy + Neurodesign + G*Power
Scanner-Synchronized Event-Related Task Build
Workflow 2: lavaan + semTools + mirt
Scale Calibration, CFA, and Measurement Invariance
Workflow 3: MRIQC + fMRIPrep + FSL FEAT
Task-fMRI Preprocessing, GLM, and ROI Extraction
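The IRT side of the psychometrics region reduces, at its simplest, to the two-parameter logistic item response function. A minimal sketch, with illustrative parameter values:

```python
import math

def p_correct_2pl(theta, a, b):
    """2PL IRT: probability of endorsing an item with discrimination a and
    difficulty b for a respondent at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the model predicts exactly 0.5, regardless of discrimination.
p_mid = p_correct_2pl(0.0, a=1.2, b=0.0)
p_hi = p_correct_2pl(2.0, a=1.2, b=0.0)
```

A mis-keyed item effectively negates a, reversing the curve; that is why reverse-coding errors propagate directly into scale scores and invariance tests.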
💼 Business & Finance

Accounting

Accounting is the professional discipline of converting enterprise transactions into controlled financial statements, tax positions, assurance evidence, and regulatory filings for operating companies, shared-service organizations, audit firms, and tax advisers. Senior practitioners typically locate themselves in financial reporting and consolidation, where SAP S/4HANA Group Reporting anchors group close and elimination logic; external reporting and disclosure, where Workiva structures filing assembly and tagged reporting; audit and assurance, where Caseware IDEA supports full-population testing; corporate tax, where ONESOURCE governs provision and return logic; indirect tax, where Avalara handles transaction-level determination; and adjacent controllership operations such as close, reconciliations, and subledger integrity, where BlackLine is often the operating control point. The field spans both operators running the books and advisers testing, challenging, or restructuring those outputs. Across all these areas, the hard part is not isolated calculation but keeping source transactions, master data, entity hierarchies, account mappings, currency treatment, tax attributes, and filing rules synchronized across multiple systems and reporting bases. A defect in any one layer propagates into reconciliations, audit evidence, tax positions, covenant metrics, and machine-readable filings.

Workflow 1: SAP S/4HANA Group Reporting + Arelle
Multi-Entity Consolidation and Filing-Ready Reporting
Workflow 2: Caseware IDEA + KNIME Analytics Platform
Full-Population Journal Entry and Revenue Cutoff Testing
Workflow 3: ONESOURCE Tax Provision + SAP S/4HANA
Corporate Tax Provision and Deferred Tax Rollforward
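The elimination logic at the heart of consolidation can be reduced to a toy: sum entity balances, check that intercompany accounts net to zero, and drop them. Entity names, accounts, and amounts are illustrative, not an S/4HANA data model.

```python
# Toy group consolidation over (entity, account) -> balance.
balances = {
    ("ParentCo", "cash"): 500.0,
    ("ParentCo", "ic_receivable_from_SubCo"): 200.0,   # intercompany pair...
    ("SubCo", "cash"): 300.0,
    ("SubCo", "ic_payable_to_ParentCo"): -200.0,       # ...that must net to zero
}

def consolidate(bal):
    """Group total with intercompany (ic_) accounts eliminated; refuses to
    consolidate if the IC pairs do not net to zero."""
    ic = [v for (ent, acct), v in bal.items() if acct.startswith("ic_")]
    assert abs(sum(ic)) < 1e-9, "intercompany accounts do not net to zero"
    return sum(v for (ent, acct), v in bal.items() if not acct.startswith("ic_"))

group_cash = consolidate(balances)  # the IC receivable/payable pair drops out
```

The assertion is the interesting part: an unmatched IC balance is a mapping or timing defect upstream, and refusing to net it silently is exactly the control posture the paragraph describes.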

Actuarial & Insurance

Actuarial & Insurance is the professional practice of turning exposure, loss experience, policy terms, and hazard information into prices, reserve opinions, portfolio risk views, and claim-handling decisions across carriers, reinsurers, MGAs, brokers, and advisory teams. Experts usually self-locate in pricing and product/rating work, where WTW Emblem supports model development and WTW Radar turns rating logic into deployable tables; reserving and financial valuation, where WTW ResQ supports triangle analysis, ultimate selection, and IBNR review; catastrophe exposure management, where Verisk Touchstone supports event-loss and exceedance-probability analysis; reinsurance analytics, where OasisLMF supports ceded-loss and treaty-structure studies; and claims operations, fraud, and severity management, where Guidewire ClaimCenter anchors FNOL-to-settlement workflows and operational queue design. Advisory variants mirror the same map in rate reviews, reserve opinions, catastrophe studies, and portfolio diagnostics. Across all of these areas, results depend on tightly coupled exposure definitions, valuation dates, policy and treaty terms, external event data, and system-specific data models, so a small mismatch in grain, timing, or financial logic propagates directly into price levels, reserve indications, loss curves, ceded recoveries, and claim-routing decisions.

Workflow 1: WTW Emblem + WTW Radar
Personal Auto Rate Plan Build and Deployment
Workflow 2: WTW ResQ + ChainLadder
Claim-Level Reserve Indication and IBNR Estimation
Workflow 3: Verisk Touchstone + OasisLMF
Portfolio Catastrophe Loss and EP Curve Generation
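The reserving region's workhorse is the chain-ladder method; a compact sketch over a tiny cumulative triangle (figures illustrative) shows the whole mechanism, volume-weighted development factors followed by projection to ultimate:

```python
import numpy as np

# Cumulative paid-loss triangle: rows = accident years, columns = development
# ages; np.nan marks cells not yet observed. Figures are illustrative.
tri = np.array([
    [100.0, 150.0, 160.0],
    [110.0, 165.0, np.nan],
    [120.0, np.nan, np.nan],
])

def chain_ladder_ultimates(tri):
    """Volume-weighted age-to-age factors, then project each row to ultimate."""
    n = tri.shape[1]
    factors = []
    for j in range(n - 1):
        rows = ~np.isnan(tri[:, j + 1])            # rows observed at both ages
        factors.append(tri[rows, j + 1].sum() / tri[rows, j].sum())
    ults = []
    for i in range(tri.shape[0]):
        last = int(np.max(np.where(~np.isnan(tri[i]))))  # latest observed age
        u = tri[i, last]
        for j in range(last, n - 1):
            u *= factors[j]
        ults.append(u)
    return factors, ults

factors, ults = chain_ladder_ultimates(tri)
ibnr = [u - tri[i, int(np.max(np.where(~np.isnan(tri[i]))))]
        for i, u in enumerate(ults)]
```

A wrong valuation date shifts which diagonal is "latest", which is how a small timing mismatch flows straight into the IBNR indication.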

Business Analytics

Business Analytics is the professional discipline that converts market, customer, transaction, and media data into research, segmentation, spend, and reporting decisions for in-house insights teams, growth operators, and advisory firms. Its major practice areas include survey programming and field management, typically run in Qualtrics; survey methodology, weighting, and tabulation, often executed in Q; customer segmentation and audience strategy, frequently built in Displayr; marketing effectiveness and media investment advisory, increasingly centered on Google Meridian and Meta Robyn; and executive reporting and BI delivery, commonly published in Tableau. Adjacent work includes retail and loyalty analytics, experimentation support, and analytics consulting that packages these outputs for brand, category, and finance stakeholders. Across all these areas, difficulty comes from reconciling instrument logic, sample design, respondent or customer identity, temporal aggregation, media taxonomy, and KPI definitions across incompatible systems. Small errors in one layer propagate quickly into false segments, misestimated contribution, and misallocated budget because the field depends on structurally consistent inputs long before it depends on presentation quality.

Workflow 1: Qualtrics + Prolific
Survey Programming and Pilot Launch
Workflow 2: Q + R survey
Weighting and Fieldwork Quality Audit
Workflow 3: Displayr + Latent GOLD
Customer Segmentation and Profile Extraction
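The weighting work mentioned above, at its simplest, is post-stratification: scale each stratum so the weighted sample matches known population shares. The age bands and shares below are illustrative; production weighting in Q or the R survey package typically rakes across several margins at once.

```python
def post_strat_weights(sample_counts, population_shares):
    """One-dimensional post-stratification: weight each stratum so the
    weighted sample recovers the known population mix."""
    n = sum(sample_counts.values())
    return {s: (population_shares[s] * n) / sample_counts[s]
            for s in sample_counts}

# Illustrative skew: younger respondents over-represented in field.
sample = {"18-34": 600, "35-54": 250, "55+": 150}
population = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
weights = post_strat_weights(sample, population)
```

Extreme weights (here 55+ gets about 2.33) inflate variance, which is why weighting audits look at weight distributions, not just margin fit.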

Finance

Finance is the professional practice of valuing assets, allocating capital, measuring exposure, and testing decision rules across corporations, asset managers, banks, insurers, and advisory firms. Experts usually self-locate in corporate finance and equity research, where Capital IQ anchors comparable-company and DCF work; treasury and rates analytics, where QuantLib is used to build discount curves and scenario packs; portfolio construction and asset allocation, where Bloomberg PORT supports holdings analysis and policy benchmarking; market and investment risk, where MSCI Barra frames factor exposure, VaR, and stress testing; systematic research, where LEAN is used for point-in-time backtests; and fund analytics and holdings transparency, where SEC N-PORT filings support public-fund reconstruction and review. Operator and advisory variants coexist: buy-side teams run portfolios and controls directly, while valuation, risk, and transaction specialists challenge or re-perform the same analyses for clients or committees. Across all these areas, point-in-time data availability, instrument conventions, calendars, corporate actions, cost assumptions, and regulatory or fiduciary rules are tightly coupled, so a small mapping or timing error propagates immediately into valuation, risk, and performance conclusions.

Workflow 1: LibreOffice Calc + QuantLib
Public-Company DCF Valuation
Workflow 2: QuantLib + LibreOffice Calc
USD Discount-Curve Construction and Scenario Pack
Workflow 3: PerformanceAnalytics + rugarch
Multi-Asset VaR, ES, and Stress Testing
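The DCF workflow above reduces mechanically to discounting a cash-flow vector; a minimal sketch with illustrative figures (flat discount rate, toy terminal value, none of QuantLib's day-count or calendar machinery):

```python
def npv(rate, cashflows):
    """Present value of cashflows[t] received at the end of year t+1,
    discounted at a single flat annual rate."""
    return sum(cf / (1.0 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

# Toy DCF: two years of free cash flow, then a final year that also carries
# a terminal value, all discounted at a 10% cost of capital. Illustrative.
fcf = [100.0, 110.0, 121.0]
terminal = 1210.0
value = npv(0.10, fcf[:-1] + [fcf[-1] + terminal])
```

The simplification to ignore is the flat rate: real curves, day counts, and settlement calendars are exactly where the "instrument conventions" coupling in the paragraph bites.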

Human Resources

Human Resources is the management of workforce structure, employment records, talent flows, pay architecture, compliance reporting, and labor plans across enterprises, public-sector employers, and the advisors who support them. In practice, senior experts usually self-locate in core HR operations and HRIS, where Workday HCM governs employee, position, and organizational records; job architecture and organizational design, often maintained in SAP SuccessFactors HCM to normalize job families, levels, and reporting lines; talent acquisition operations, where Greenhouse Recruiting governs requisitions, candidate stages, and hiring velocity; compensation and total rewards, where Mercer anchors market pricing and salary bands; workforce planning and analytics, where Oracle Strategic Workforce Planning converts headcount, attrition, and demand assumptions into future-state plans; and adjacent employment reporting and benchmarking, often consolidated in Power BI against EEO-1, OPM, or labor-market reference sets. The field spans both operator teams inside HR, recruiting, and compensation functions and advisory specialists in market pricing, org design, and workforce analytics. Across all of these areas, the hard part is not isolated calculations but keeping effective-dated employee records, organizational hierarchies, requisition event logs, pay histories, geography factors, and external benchmark definitions aligned at the same time. Errors in any one layer propagate quickly because job architecture, compensation ranges, hiring metrics, regulatory counts, and workforce forecasts all depend on the same underlying identities, dates, and classification rules.

Workflow 1: Workday HCM + KNIME Analytics Platform
Effective-Dated Workforce Snapshot Reconstruction
Workflow 2: SAP SuccessFactors HCM + KNIME Analytics Platform
Job Architecture Standardization and Occupational Mapping
Workflow 3: Mercer + Power BI
Salary Benchmarking and Band Design
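Effective-dated snapshot reconstruction, the first workflow above, comes down to an as-of lookup: the record in force on a date is the latest row effective on or before it. Field names and dates below are illustrative, not Workday's object model.

```python
from datetime import date

# Effective-dated job records for one worker; fields are illustrative.
history = [
    (date(2021, 1, 1), {"job": "Analyst", "grade": 5}),
    (date(2022, 7, 1), {"job": "Senior Analyst", "grade": 6}),
    (date(2024, 3, 15), {"job": "Manager", "grade": 7}),
]

def as_of(history, snapshot_date):
    """Record in effect on snapshot_date: the latest row whose effective
    date is on or before it, or None before any row applies."""
    current = None
    for eff, rec in sorted(history):
        if eff <= snapshot_date:
            current = rec
        else:
            break
    return current

rec_2023 = as_of(history, date(2023, 12, 31))
```

Off-by-one choices at the boundary (effective-dated "on" versus "after") are precisely the classification-rule detail the paragraph warns about; here the row is in force on its own effective date.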

Project Management

Project management is the discipline of translating scope, dependencies, resources, calendars, budgets, and delivery commitments into executable plans and control systems across capital projects, enterprise change programs, product organizations, PMOs, and external advisory engagements. The field is usually segmented into schedule development and critical-path planning, where Oracle Primavera P6 and ProjectLibre anchor detailed network logic; delivery planning and PMO coordination, where Microsoft Project and Microsoft Planner structure task hierarchies, milestones, and baseline governance; resource and skills-based capacity planning, where LibrePlan supports load balancing and qualification-aware assignment; project controls and forecasting, where Primavera Cloud and ProTrack support status-date updates, earned value, and finish-date prediction; and portfolio governance and schedule assurance, where Planisware and Deltek Acumen Fuse support prioritization, cross-project capacity tradeoffs, and diagnostic review. Both operator-led teams and advisory specialists work across these regions, often with the same schedule and cost data moving between systems. Across all these areas, the technical difficulty comes from the fact that dates, logic, calendars, resource ceilings, status updates, and control metrics are tightly coupled, so a modeling error in one layer can materially distort downstream forecasts, portfolio decisions, and assurance results. The field is therefore less about maintaining task lists than about preserving a coherent planning model under shifting operational and governance constraints.

Workflow 1: ProjectLibre Desktop + Oracle Primavera P6
Baseline Schedule and Critical Path Build
Workflow 2: LibrePlan + Microsoft Project
Skills-Based Resource Loading and Leveling
Workflow 3: LibrePlan + ProTrack
Status-Date Update, Earned Value, and Forecast

Sales & Marketing

Sales & Marketing is the commercial field that manages how organizations create demand, convert interest into revenue, price offers, and measure commercial performance across in-house operating teams and external advisors. Its recognized practice areas include revenue operations and CRM administration in Salesforce Sales Cloud, pipeline management and forecast governance in Odoo CRM, demand generation and lifecycle marketing in HubSpot Marketing Hub and Mautic, marketing analytics and attribution in Google Analytics 4, and commercial pricing, quoting, and proposal execution in ERPNext. Adjacent regions include account-based programs, partner and channel motions, sales enablement, and proposal governance, typically expressed through the same object models, campaign taxonomies, and pricing controls rather than through separate disciplines. The work may sit with sales operations, marketing operations, revenue operations, analytics, sales engineering, or agency teams running the same systems on a client’s behalf. Across all these areas, difficulty comes from the coupling between object relationships, event timing, spend and revenue reconciliation, approval logic, consent and suppression rules, and price governance: a broken association, stale stage, unmapped UTM, or misapplied discount rule propagates directly into forecasts, audience eligibility, attribution, and quotes. Field performance therefore depends on simultaneous control of CRM data integrity, automation behavior, commercial policy, and auditable structured outputs.

Workflow 1: HubSpot Import Tools + Odoo
CRM Object Import and Relationship Reconciliation
Workflow 2: Odoo CRM + Metabase
Quarterly Sales Forecast and Pipeline Risk Decomposition
Workflow 3: Mautic + SuiteCRM
Lifecycle Nurture Orchestration and Suppression QA

Supply Chain & Logistics

Supply Chain & Logistics is the discipline that turns demand, inventory, sourcing, network, and transport decisions into executable service and cost outcomes across manufacturers, retailers, distributors, 3PLs, and the advisory teams that redesign those systems. Recognized practice areas include planning data and model governance in SAP IBP, demand planning and forecast governance in Blue Yonder, concurrent supply-response and scenario control in Kinaxis, network strategy and footprint design in Coupa Supply Chain Modeler, inventory policy and multi-echelon simulation in anyLogistix, and transportation engineering through road-time analysis in OSRM. Experts usually self-locate by the decision layer they own—forecast quality, service and working-capital policy, footprint economics, or transport feasibility—even when the same business case spans several of them. Adjacent work includes benchmark design, master-data translation, and central planning governance for shared planning models. Across all these areas, errors in hierarchies, time buckets, lead times, road matrices, or service targets propagate immediately because forecast, inventory, network, and routing models share keys and constraints even when they run in different products. The technical burden is maintaining a single coherent planning model while simultaneously balancing service levels, working capital, fixed-network cost, and route feasibility.

Workflow 1: SAP IBP + Kinaxis
Planning Master Data Standardization
Workflow 2: StatsForecast + HierarchicalForecast
Hierarchical Retail Demand Forecasting
Workflow 3: anyLogistix + Pyomo
Multi-Echelon Inventory Policy Optimization
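The inventory-policy layer above rests on a textbook relationship worth stating once: reorder point = expected demand over the lead time plus safety stock. The sketch uses the classic z * sigma_d * sqrt(LT) form under strong assumptions (i.i.d. daily demand, deterministic lead time); all figures are illustrative.

```python
import math

def reorder_point(mean_daily_demand, sd_daily_demand, lead_time_days, z=1.645):
    """Continuous-review sketch: ROP = pipeline demand + safety stock.
    z = 1.645 targets roughly a 95% cycle service level; assumes i.i.d.
    daily demand and a fixed lead time."""
    safety = z * sd_daily_demand * math.sqrt(lead_time_days)
    return mean_daily_demand * lead_time_days + safety

rop = reorder_point(mean_daily_demand=40.0, sd_daily_demand=10.0,
                    lead_time_days=9)
```

A lead-time error enters under the square root *and* in the pipeline term, which is why master-data defects in lead times distort service and working capital simultaneously.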
⚖️ Legal

Digital Forensics

Digital Forensics is the disciplined acquisition, examination, interpretation, and defense of digital evidence from computers, removable media, application traces, and related data stores. The field spans law-enforcement laboratories, corporate investigations, incident-response teams, and expert-witness practices, and experts typically self-locate in evidence acquisition and preservation work centered on FTK Imager, endpoint and media examination conducted in Autopsy, artifact and application forensics built around RegRipper, investigative case reconstruction often managed in OpenText Forensic environments, timeline and correlation analysis driven by Plaso, and laboratory validation or training work anchored to CFReDS and other reference corpora. Across those regions, practitioners move between Windows, macOS, and Linux file systems, browser databases, registry hives, event logs, pagefiles, and structured evidence containers such as E01 and raw/DD, with both operator and advisory variants depending on whether the work is investigative, internal, or litigated. Across all these areas, evidentiary integrity, container handling, filesystem semantics, artifact parsing, and clock normalization are tightly coupled, so a mistake in imaging, offset interpretation, hash control, or timezone treatment can invalidate recovery, attribution, and defensibility at once while also colliding with chain-of-custody, privacy, and procedural constraints.

Workflow 1: FTK Imager + AFF4 Imager
Multi-Media Forensic Imaging and Integrity Validation
Workflow 2: Autopsy + PhotoRec
Cross-File-System Deleted and Journal-Based Recovery
Workflow 3: Autopsy + RegRipper + Hindsight
Single-Host Exfiltration Path Reconstruction
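The integrity-validation half of imaging reduces to cryptographic hashing of the acquired stream; the sketch below hashes in chunks so a multi-gigabyte raw/DD image never has to fit in memory. The in-memory byte string stands in for a real evidence file opened in binary mode.

```python
import hashlib
import io

def sha256_stream(fileobj, chunk_size=1 << 20):
    """Chunked SHA-256 of an evidence stream; returns the hex digest
    recorded in the acquisition log and re-verified on later access."""
    h = hashlib.sha256()
    for chunk in iter(lambda: fileobj.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

# Stand-in for an image file; real use: with open(path, "rb") as f: ...
digest = sha256_stream(io.BytesIO(b"abc"))
```

Matching acquisition and verification digests is what makes later offset-level analysis defensible; any mismatch invalidates everything downstream, exactly as the paragraph describes.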

Legal Research & Analysis

Legal Research & Analysis is the professional practice of converting court records, statutes, regulations, precedent, and patent disclosures into usable authority, filing support, and patentability conclusions for law firms, in-house legal teams, litigation support groups, and IP counsel. The field spans court-record and docket operations anchored in PACER and CourtListener; litigation merits, trial-motion research, and adverse-authority analysis centered on Westlaw; appellate briefing, citation quality control, and table-of-authorities work often run in Lexis+; public-law and rule research that reconciles statutes and regulations in those same research systems against official texts; patentability and invalidity search conducted in Espacenet; and prosecution-history or legal-status review grounded in USPTO Patent File Wrapper. It includes both operator-heavy workflows that assemble and normalize source materials and advisory workflows that select authority, assess risk, and shape filing-ready arguments. Across these areas, correctness depends on tightly coupled jurisdiction hierarchies, filing-version control, citation normalization, and patent-family deduplication: a wrong docket version, stale treatment signal, or uncollapsed family can propagate directly into adverse-authority analysis, quote support, legal-status calls, and patentability conclusions.

Workflow 1: CourtListener + PACER
Federal Docket Package Assembly
Workflow 2: Westlaw + PACER
Motion-to-Dismiss Precedent Research and Good-Law Validation
Workflow 3: Lexis+ Brief Analysis + Lexis for Microsoft Office
Appellate Brief Citation Audit and TOA Rebuild

Legal Technology & eDiscovery

Legal Technology & eDiscovery is the professional practice of collecting, reducing, reviewing, logging, and producing electronically stored information and contract data for disputes, investigations, regulatory matters, and transactions across law firms, alternative legal service providers, forensic vendors, and in-house legal teams. Senior practitioners usually self-locate in forensic collection and processing, where Nuix anchors native ingestion, OCR, metadata extraction, and email threading; litigation review and early case assessment, where RelativityOne and Everlaw structure responsiveness review and high-recall prioritization; privilege and confidentiality analysis, where Relativity aiR for Privilege supports document-level privilege decisions and draft log descriptions; production and protocol compliance, where Everlaw manages processed-data exports and production packages; commercial contract diligence, where Kira extracts clause-level fields from large contract sets; and M&A agreement diligence, where eBrevia structures deal-point analysis for transaction teams. The field spans both operator-led execution and advisory-led defensibility design. Across all these regions, work quality depends on keeping text extraction, metadata fidelity, family relationships, legal doctrine, and platform-specific export rules aligned at the same time: a broken OCR layer, lost attachment link, overbroad privilege call, or malformed load file can invalidate downstream review, negotiation, or production even when the underlying legal judgment is otherwise sound.

Workflow 1: Nuix Workstation + OCRmyPDF
Native ESI Processing and Review-Set Creation
Workflow 2: Relativity Active Learning + OpenSearch
High-Recall Responsive Review Prioritization
Workflow 3: RelativityOne + Relativity aiR for Privilege
Privilege Review and Privilege Log Generation
🎨 Visual & Media Arts

3D Modeling & Animation

3D Modeling & Animation is the professional practice of designing, building, animating, lighting, and finishing digital characters, environments, props, and shots for film, episodic, advertising, games, and visualization. Senior practitioners usually self-locate in asset creation and look development, where Maya and ZBrush anchor modeling, sculpt, topology, and UV decisions; character rigging and deformation, where Maya defines control systems and skinning behavior; animation and motion editing, where MotionBuilder is common for mocap cleanup and retargeting; lighting and rendering, where Arnold produces multilayer OpenEXR output for downstream use; compositing and finishing, where Nuke integrates CG with plates; and pipeline or technical-direction work, where OpenUSD governs scene layering, interchange, and handoff. The field includes both operator teams inside studios and vendors, and advisory or pipeline groups that standardize asset structures, scene schemas, and delivery packages. Across these areas, geometry, rig state, caches, scene graphs, render passes, and color transforms remain tightly coupled, so an error in scale, naming, coordinate space, or colorspace can survive several handoffs before surfacing as deformation failure, unusable AOVs, or a broken comp. Reliable execution depends on exchange-format integrity, pass completeness, and numeric checks on shape, contact, noise, and integration rather than on screenshots alone.

Workflow 1: Blender + OpenUSD
Character Modeling, Retopology, and UV Layout
Workflow 2: Maya + mGear
Biped Rigging and Skinning
Workflow 3: MotionBuilder + Maya
Mocap Cleanup, Retargeting, and Shot Polish

Audio Engineering & Production

Audio engineering and production is the professional practice of turning multitrack recordings, production dialogue, designed effects, and premaster mixes into release-ready, broadcast-safe, or implementation-ready audio deliverables. The field is typically segmented into music recording and session editorial, where engineers organize and prep source material in Pro Tools; mix engineering, where balance, dynamics, and stem structure are built for final program output; post-production editorial and restoration, where dialogue, effects, and damaged source audio are repaired in Nuendo and iZotope RX; mastering and release or broadcast delivery, where WaveLab and Sequoia are used to normalize level, sequence programs, and package masters; and sound design plus asset-library management for film and games, where Soundminer supports search, tagging, and retrieval. These regions are staffed both by hands-on operators and by supervising leads who set delivery rules, QC thresholds, and interchange standards. Across all these areas, sample-accurate edit integrity, phase and polarity interactions, metadata completeness, and loudness or true-peak compliance are tightly coupled, so mistakes in one layer propagate directly into translation quality, downstream editorial reuse, searchability, and delivery acceptance.

Workflow 1: Ardour + BWF MetaEdit
Multitrack Session Prep for Mixing
Workflow 2: Pro Tools + Youlean Loudness Meter
Stereo Mix and Stem Print
Workflow 3: WaveLab + BWF MetaEdit
EP Mastering and Sequence Assembly
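The loudness and peak compliance mentioned above starts from the dBFS scale; here is a minimal sample-peak sketch for float audio normalised to [-1, 1]. It deliberately measures sample peak only: inter-sample true peak, which delivery specs actually cite, requires oversampling.

```python
import math

def peak_dbfs(samples):
    """Sample peak in dBFS for float samples in [-1.0, 1.0]. Inter-sample
    true peak is not modelled; a silent signal returns -inf."""
    peak = max(abs(s) for s in samples)
    return -math.inf if peak == 0 else 20.0 * math.log10(peak)

level = peak_dbfs([0.1, -0.5, 0.25])  # loudest sample at 0.5
```

A mix can pass a sample-peak check at -0.1 dBFS and still clip after conversion, which is why mastering QC uses true-peak meters rather than this simpler measure.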

Film & Video Post-Production

Film & video post-production is the professional practice of turning acquired picture and sound into reviewable, versioned, and deliverable screen content across editorial, finishing, and handoff environments. Experts typically self-locate in editorial operations and assistant editorial, where Avid Media Composer governs ingest, sync, bin structure, turnovers, and reconforms; picture editorial, where Adobe Premiere Pro is common for assembly, promo, and fast-turn narrative cutting; color finishing, where DaVinci Resolve and Baselight control look development, shot matching, HDR/SDR trims, and display transforms; VFX and compositing, where Nuke completes layered OpenEXR shots and matte workflows; audio post, where Pro Tools handles dialogue prep, premix, stems, and printmasters; and conform, mastering, and QC, where Resolve is used to rebuild locked timelines and validate deliverables. The field includes both operator-side execution inside post houses and studio-side supervision, vendor review, and turnover governance. Across all these regions, errors propagate because frame rate, timecode, media linking, track layout, color-space transforms, and delivery specs are tightly coupled: a one-frame sync offset, bad reel/name mapping, or wrong OCIO/ACES assumption can invalidate conform, grade, mix, and QC at once. The work is difficult less because any one step is exotic than because reference media, sidecar metadata, and interchange files must stay internally consistent across multiple applications and review contexts.

Workflow 1: DaVinci Resolve + FFmpeg
Dual-System Sync and Multicam Grouping
Workflow 2: Adobe Premiere Pro + OpenTimelineIO
Narrative Scene Cut and Turnover Package
Workflow 3: DaVinci Resolve + OpenColorIO
ACES Version Grading and Shot Matching
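The one-frame sync offsets this card warns about come straight out of timecode arithmetic. A minimal sketch, assuming a non-drop 24 fps project (drop-frame rates need different counting rules):

```python
FPS = 24  # assumed non-drop-frame project rate

def tc_to_frames(tc, fps=FPS):
    """Convert HH:MM:SS:FF non-drop timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offset(picture_tc, sound_tc, fps=FPS):
    """Signed picture-minus-sound offset in frames; nonzero means a reconform risk."""
    return tc_to_frames(picture_tc, fps) - tc_to_frames(sound_tc, fps)
```

A single trailing frame on one side is enough to flag a sync mismatch across the whole turnover.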

Game Development

Game Development is the professional practice of designing, building, integrating, testing, and operating interactive game systems across first-party studios, co-development partners, technical art vendors, porting teams, and live-operations groups. Recognizable practice areas include game and systems design, where teams balance mechanics and progression inside Unreal Engine; level and world design, where playable spaces and encounter routes are blocked and iterated in Godot GridMap; art, animation, and technical art, where characters, props, and materials move from Blender or Maya into engine-ready scenes; gameplay and engine engineering, where GDScript implements mission state, interaction logic, and runtime tooling; runtime simulation and performance, where Godot scene systems stabilize physics, navigation, and animation behavior; and QA, build, release, and live operations, where Perforce P4 anchors versioned content, regression control, and deployment workflows. The same field map applies whether work is done by an internal studio team or by external co-development, QA, or porting specialists. Across these areas, asset formats, import settings, scene hierarchies, state logic, navigation, physics, and performance budgets are tightly coupled, so a scale error, broken signal chain, or unstable collider in one layer quickly propagates into broken traversal, invalid tests, and misleading balance data elsewhere.

Workflow 1: Godot + Blender
Greybox Level Layout and Navigation
Workflow 2: Godot + GDScript
Mission Loop and Interaction Scripting
Workflow 3: Godot + Khronos glTF Validator
Character and Prop Integration with Materials and Animation
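The mission-state logic described above reduces to a small state machine. A hedged sketch in Python (a real implementation would live in GDScript); the states and events are hypothetical, and the point is that illegal transitions fail loudly instead of silently corrupting state:

```python
# Hypothetical mission states and events for illustration.
TRANSITIONS = {
    ("inactive", "accept"): "active",
    ("active", "complete_objective"): "turn_in",
    ("active", "abandon"): "inactive",
    ("turn_in", "deliver"): "done",
}

def step(state, event):
    """Advance the mission state machine; unknown (state, event) pairs raise."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal event {event!r} in state {state!r}")
```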

Graphic & Visual Design

Graphic & Visual Design is the professional discipline of converting text, sketches, photographs, and brand requirements into reproducible visual assets for print, digital publishing, packaging, marketing, and environmental applications. Senior practitioners usually self-locate in brand identity and design advisory, where Adobe Illustrator governs master marks and lockups; editorial and publication design, where Adobe InDesign structures long-form layouts and linked assets; illustration and icon-system design, where Krita supports authored visual systems; photographic post-production, where darktable manages non-destructive RAW development; production art and prepress, where Acrobat Preflight enforces press-readiness; and creative operations and asset governance, where ExifTool normalizes metadata, naming, and rights fields across mixed asset libraries. The field spans in-house creative teams, agencies, specialist studios, and downstream production vendors, with advisory work upstream and manufacturing-ready handoff downstream. Across all these areas, aesthetic decisions are inseparable from file formats, color profiles, font licensing, metadata, and export standards such as SVG, XMP, ICC, and PDF/X. Failures in any one layer propagate immediately because geometry, linked assets, color management, and rights metadata all have to survive cross-tool handoff without manual reinterpretation.

Workflow 1: Adobe Bridge + ExifTool
Asset Intake and Metadata Normalization
Workflow 2: Adobe Illustrator + Inkscape
Brand Identity Master and Variant Export
Workflow 3: Adobe InDesign + Acrobat Preflight
Multi-Page Publication Layout and Preflight
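One concrete instance of the color-management coupling above is the sRGB transfer curve: resizing or blending in the encoded space rather than linear light is a classic silent failure. A minimal sketch of the IEC 61966-2-1 piecewise curve:

```python
def srgb_to_linear(c):
    """Decode one sRGB channel value in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode linear light back to the sRGB transfer curve."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055
```

The round trip must be lossless to within float error, which is exactly the property cross-tool handoff depends on.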

XR / Immersive Media

XR / Immersive Media is the design, production, integration, and validation of interactive spatial experiences across VR, AR, and spatial-computing environments for studios, product teams, simulation groups, enterprise operators, and specialist consultancies. The field is usually organized into spatial experience production and technical art, commonly centered on Blender and Unity; high-fidelity simulation and premium real-time delivery, often built in Unreal Engine; mobile world-anchored AR activation, typically implemented through AR Foundation; marker- and image-tracked guidance experiences, which sit in the same mobile AR practice but operate under tighter sensing and asset constraints; native spatial-computing delivery for platform-specific products, anchored in RealityKit; and spatial audio design and acoustic verification, where FMOD Studio is the practical control surface for routing, state logic, and probe-based evaluation. Runtime integration, interchange, and standards QA cut across all of these areas and are commonly framed around OpenXR. Across these regions, units, coordinate frames, asset conditioning, tracking stability, interaction state, acoustic propagation, and frame-time budgets are tightly coupled, so small errors in one layer propagate into broken placement, degraded presence, or non-reproducible behavior across devices, runtimes, and operating-system boundaries.

Workflow 1: Blender + Unity
Headset VR Scene Assembly and Performance Qualification
Workflow 2: Unity + AR Foundation
Mobile AR Plane Placement, Occlusion, and Manipulation
Workflow 3: arcoreimg + Unity
Image-Tracked AR Guidance Experience
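The coordinate-frame coupling above is where many XR pipeline bugs live: Blender is right-handed Z-up, Unity is left-handed Y-up. A sketch of one plausible axis remap; the exact sign convention is exporter-dependent and is an assumption to verify per pipeline, but the round-trip property below should hold for whichever mapping a team adopts:

```python
def blender_to_unity(p):
    """One plausible remap from a right-handed Z-up frame to a left-handed
    Y-up frame: swap Y and Z. Sign conventions vary by exporter."""
    x, y, z = p
    return (x, z, y)

def unity_to_blender(p):
    x, y, z = p
    return (x, z, y)  # this particular swap is its own inverse
```

The "up" axis lands on "up", and converting there and back is the identity; any remap that fails those two checks will scatter placement errors across devices.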
💻

Computing & Data Science

AI/ML Engineering

AI/ML Engineering is the professional practice of turning datasets, pretrained weights, feature logic, and serving infrastructure into reproducible model systems that can be trained, deployed, monitored, and changed safely in production. Its practice areas include applied model development for vision, language, and tabular workloads in PyTorch; retrieval and embedding systems built around Sentence Transformers for corpus-query-qrels regimes; recommendation and personalization engineering using TorchRec over large sparse event logs; platform and feature operations anchored in Feast for point-in-time correctness and training-serving consistency; inference and performance engineering using Triton Inference Server for packaging, concurrency, and hardware-aware serving; and evaluation, drift, and operational governance using Evidently for shadow replay, regression review, and promotion control. The work appears both in operator teams running live product systems and in advisory or systems-integration teams standardizing stacks, migration paths, and control frameworks across business units. Across all of these areas, the hard part is the coupling among data versions, feature definitions, model artifacts, benchmark protocols, latency budgets, and policy constraints on privacy, safety, and model licensing. A change in any one layer can invalidate offline metrics, break deployment parity, distort rollback assumptions, or weaken auditability.

Workflow 1: PyTorch + ONNX Runtime
Vision Model Fine-Tuning and Export
Workflow 2: Sentence Transformers + FAISS
Dual-Encoder Retrieval Tuning and Index Build
Workflow 3: Great Expectations + Feast
Tabular Pipeline with Shared Feature Definitions
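The point-in-time correctness this card attributes to the feature layer has a simple core: at training time, each label event may only see the latest feature value recorded at or before the event. A minimal sketch (a feature store like Feast does this join at scale; the data here is illustrative):

```python
def point_in_time_value(history, event_ts):
    """Latest feature value whose timestamp is <= event_ts.
    Selecting any later value would leak the future into training."""
    eligible = [(ts, v) for ts, v in history if ts <= event_ts]
    if not eligible:
        return None
    return max(eligible)[1]

# (feature_timestamp, value) pairs for one entity, hypothetical data
history = [(1, "v1"), (5, "v2"), (9, "v3")]
```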

Cybersecurity & Reverse Engineering

Cybersecurity & Reverse Engineering is the professional practice of turning opaque binaries, network activity, applications, and device firmware into defensible evidence about behavior, exposure, exploitability, and control failure. Senior practitioners usually self-locate in one of several recognized regions: malware analysis and threat research centered on Ghidra; incident response and network forensics anchored in Wireshark; exposure management and infrastructure validation using Nmap; application security and authorized penetration testing using Burp Suite Professional; and firmware and embedded-product security built around Binwalk. The field spans both operator and advisory variants: internal SOC, IR, and product-security teams work alongside external red teams, incident-response firms, and specialized reverse-engineering consultancies. In practice, adjacent work also reaches exploit validation, vulnerability research, and evidence production for legal, regulatory, or insurer-facing review. Across all these areas, the hard part is that truth sits in compiled code, protocol state, runtime side effects, and device-specific file systems rather than in human-readable process records. Static artifacts, dynamic behavior, target scope, and standardized taxonomies such as ATT&CK, CVE, and KEV have to reconcile cleanly, because mistakes in unpacking, emulation, protocol reconstruction, or scope control propagate directly into false attribution, missed exposure, or invalid proof.

Workflow 1: Ghidra + x64dbg + CAPE Sandbox | Open-source | Windows guest + Linux host
Packed Malware Configuration and IOC Extraction
Workflow 2: Wireshark + Zeek + Suricata | Open-source | Linux or analyst workstation
Enterprise PCAP Triage and Timeline Reconstruction
Workflow 3: Nmap + Nuclei + OpenVAS | Open-source | Linux
External Attack Surface Validation and Exposure Triage

Data & Computer Science

Data & Computer Science is the professional practice of turning event, transaction, and operational data into governed metrics, statistical evidence, predictive models, and decision systems for digital products, operating teams, and advisory analytics engagements. The field is typically segmented into warehouse and measurement infrastructure, where BigQuery anchors raw event export and governed access; analytics engineering and metric governance, where dbt Core defines fact tables, dimensions, and testable business logic; product and business analytics, where JupyterLab supports funnel audits, segmentation, and anomaly review; predictive modeling, where XGBoost is used for tabular scoring and probability estimation; decision support and BI, where Apache Superset packages KPI views, filters, and drill paths; and experimentation, where GrowthBook supports controlled-experiment readouts, SRM checks, and rollout decisions. In practice, experts move between operator and advisory variants of the same work: building the system of record, auditing the metric layer, or interpreting model and experiment outputs for business action. Across all these areas, difficulty comes from coupled definitions rather than isolated analysis: nested event schemas, identity and session reconstruction, time-window conventions, train/test leakage controls, dashboard-to-metric consistency, and experimental validity checks must agree simultaneously. A quiet error in any one layer propagates into the next, so correctness depends on reconciling warehouse logic, statistical assumptions, and decision-facing outputs as one system.

Workflow 1: DuckDB + dbt Core
GA4 Session Mart and Metric Governance
Workflow 2: DuckDB + JupyterLab
Channel and Device Funnel Audit
Workflow 3: XGBoost + MLflow
Standardized Tabular Model with Calibration
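The session-reconstruction convention mentioned above is usually an inactivity timeout: a new session starts whenever the gap between consecutive events exceeds the window. A minimal sketch with the common 30-minute default (the timeout is a convention, not a law, and must match the metric layer's definition):

```python
TIMEOUT = 30 * 60  # 30-minute inactivity window, in seconds

def sessionize(event_times, timeout=TIMEOUT):
    """Split epoch-second event times into sessions: a gap greater than
    the timeout between consecutive events opens a new session."""
    sessions = []
    for t in sorted(event_times):
        if sessions and t - sessions[-1][-1] <= timeout:
            sessions[-1].append(t)
        else:
            sessions.append([t])
    return sessions
```

A gap of exactly 1800 seconds stays in-session; 1801 starts a new one, which is exactly the kind of boundary that makes dashboards disagree with the warehouse.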

Mathematics & Operations Research

Mathematics & Operations Research is the professional practice of translating operational, commercial, and service-system decisions into formal optimization, constraint, and simulation models for enterprises, public agencies, consultancies, and applied research groups. Recognizable practice areas include network, routing, and logistics optimization, where Gurobi often anchors large mixed-integer deployments; production, project, and maintenance scheduling, where IBM ILOG CP Optimizer is common; workforce rostering and service staffing, where OR-Tools CP-SAT is used for hard/soft rule sets; planning, allocation, and custom mathematical programming, where Pyomo supports advisory and in-house model development; and stochastic operations simulation and digital-twin work, where AnyLogic is used to calibrate hospitals, factories, and service systems. Adjacent benchmark and model-assurance work sits alongside these operator functions, using public instance libraries, reference bounds, and checker-driven validation to compare alternative formulations and solvers. Across all regions, the hard part is not expressing a policy but maintaining semantic fidelity among raw instance data, objective definitions, discrete feasibility rules, solver tolerances, and backtesting conventions, because an error in any one layer can make a solution look plausible while invalidating comparability or executability.

Workflow 1: Pyomo + SCIP
Vehicle Routing with Time Windows
Workflow 2: MiniZinc + Chuffed
Multi-Mode Resource-Constrained Project Scheduling
Workflow 3: OR-Tools CP-SAT + RosterViewer
Nurse Rostering
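The checker-driven validation mentioned above is often simpler than the solve itself. For the time-windowed routing in Workflow 1, a feasibility check over a fixed visit order looks like this (a sketch with hypothetical stop data; early arrival waits, late arrival is infeasible):

```python
def route_feasible(route, depart=0):
    """Hard time-window feasibility along a fixed visit order.
    Each stop is (travel_time_from_previous, window_open, window_close)."""
    t = depart
    for travel, open_, close in route:
        t += travel
        if t > close:
            return False      # window missed: route infeasible
        t = max(t, open_)     # arriving early means waiting
    return True
```

A checker like this is what keeps a plausible-looking solver output from silently violating the discrete feasibility rules.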

Quantum Computing

Quantum computing is the discipline of designing, compiling, executing, and validating computations that encode problems into qubits, gates, and measurements on simulators or quantum hardware. Recognized practice areas include quantum algorithm and circuit engineering, typically built in Qiskit; quantum chemistry and materials simulation, usually staged through PySCF before quantum execution; combinatorial optimization and hybrid heuristics, often implemented in PennyLane; compiler, transpilation, and hardware-execution engineering, where backend constraints and scheduling dominate work on IBM Quantum Platform; fault-tolerant architecture and quantum error-correction research, centered on Stim; and benchmarking and performance validation, where teams standardize workloads with MQT Bench. The field spans both operator and advisory variants: platform teams run calibration-sensitive execution stacks, while applied R&D and strategy groups assess which workloads, encodings, and resource assumptions are technically credible enough to pursue. Across all these areas, the hard part is not isolated algorithm design but the coupling between abstract problem formulations, device-specific gate sets and topology, noise and sampling budgets, and reference-truth generation: a mismatch in any one layer changes resource counts, observables, and the meaning of downstream performance claims.

Workflow 1: Qiskit + pytket
Batch Hardware-Aware Transpilation
Workflow 2: PySCF + Qiskit Nature
Small-Molecule Ground-State VQE
Workflow 3: Qiskit + PennyLane
Max-Cut QAOA on a Benchmark Graph
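The reference-truth generation this card highlights is classical. For the Max-Cut QAOA in Workflow 3, the score of a measured bitstring and the brute-force optimum on a small benchmark graph can be sketched as:

```python
def cut_value(edges, bitstring):
    """Classical cut value of a measured bitstring: one unit per edge
    whose endpoints land on opposite sides of the partition."""
    return sum(1 for u, v in edges if bitstring[u] != bitstring[v])

def best_cut(edges, n):
    """Brute-force optimum for small graphs (exponential in n; reference truth only)."""
    return max(cut_value(edges, format(b, f"0{n}b")) for b in range(2 ** n))
```

On a triangle the optimum is 2, not 3, and scoring QAOA samples against the wrong reference value is exactly the layer mismatch the paragraph describes.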

Software Development

Software development is the professional practice of specifying, building, testing, securing, releasing, and operating software products and bespoke systems for product companies, enterprise IT groups, digital agencies, and implementation partners. Experts typically self-locate in application and product engineering, where teams shape user-facing and service behavior in Next.js; platform engineering and release management, where GitHub Actions and Kubernetes define build, deploy, and runtime control; quality engineering, where Playwright governs browser and API regression; production engineering and incident response, where Sentry supports fault isolation and rollback decisions; and application security and code assurance, where CodeQL is used for static analysis and dependency-risk triage. The field includes both operator teams running their own products and advisory teams modernizing, testing, or stabilizing client systems. Across all these regions, correctness depends on tight coupling between source-control state, dependency resolution, data fixtures, interface contracts, test oracles, deployment manifests, and runtime telemetry. A drift in any one layer can invalidate results elsewhere even when the code change itself is small.

Workflow 1: Next.js + pnpm
Full-Stack Feature Delivery
Workflow 2: Playwright + Newman
Browser and API Test Automation Buildout
Workflow 3: Sentry + Chrome DevTools
Production Defect Reproduction and Repair
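Dependency-resolution drift, one of the couplings named above, often comes down to version-precedence logic. A deliberately minimal sketch: plain MAJOR.MINOR.PATCH only, with pre-release tags and build metadata out of scope, and a rough caret-style check that is an illustration rather than any package manager's actual semantics:

```python
def parse_semver(v):
    """Parse a plain MAJOR.MINOR.PATCH version into a comparable tuple."""
    return tuple(int(p) for p in v.split("."))

def satisfies_caret(version, baseline):
    """Rough caret-range sketch: same major version, not older than baseline."""
    ver, base = parse_semver(version), parse_semver(baseline)
    return ver[0] == base[0] and ver >= base
```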
🛰️

Simulation & Operations

Aviation Operations

Aviation Operations is the professional practice of converting flight demand, surveillance, weather, aircraft performance, and airspace constraints into safe, legal, and economically viable execution across airlines, ANSPs, airport operations centers, and network managers. Core practice areas include airline dispatch and OCC control, typically executed in Lido Flight 4D; ATC surveillance and sector operations, anchored in TopSky-ATC; network flow and capacity management, anchored in NEST and FAA TFMS; aeronautical data, flight-plan conformance, and cross-border coordination, anchored in EUROCONTROL NM B2B; and aircraft-performance and trajectory analysis, often modeled in OpenAP. The field also includes advisory, systems-integration, and post-ops performance teams that work on the same operational stack with different decision horizons, from real-time control to replay, benchmarking, and regulatory review. Across all these areas, correctness depends on AIRAC-effective data, time-aligned weather and surveillance feeds, aircraft-performance assumptions, and jurisdiction-specific validation logic remaining mutually consistent. A mismatch in any one layer—route availability, sector geometry, report category, or trajectory timing—propagates directly into legality checks, fuel estimates, sector counts, reroute feasibility, and delay attribution.

Workflow 1: Lido Flight 4D + NM B2B
Weather-Disrupted Multi-Flight Dispatch Replanning
Workflow 2: NM B2B + NOP
Flight-Plan Legality and Route-Availability Validation
Workflow 3: asterix + pyModeS
Surveillance Decode and Sector-Handoff Reconstruction
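The fuel estimates above start from route distance, whose first-order term is the great-circle distance. A minimal spherical sketch; operational planning uses ellipsoidal geodesics plus winds and aircraft performance, so this is the floor, not the flight plan:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a spherical Earth of radius r."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

A quarter of the equator comes out near 10,007.5 km, a quick sanity anchor before any performance model is layered on top.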

Drone / UAV Operations

Drone / UAV Operations is the professional practice of planning, executing, validating, and governing unmanned aircraft missions for mapping, inspection, monitoring, and autonomous flight. The field spans owner-operators, survey firms, utilities inspection teams, public-safety aviation units, and advisory or engineering groups that design procedures, simulate missions, or certify deliverables. Senior experts usually self-locate in survey mission planning and airspace compliance using QGroundControl, corridor and linear-asset operations using UgCS, geospatial constraint preparation and site modeling in QGIS, photogrammetry and reconstruction in WebODM, autonomy assurance and flight-software validation in PX4, and technical acceptance and geometry audit in CloudCompare. These regions are professionally distinct even when one program team covers several of them, because each has its own operating assumptions, deliverables, and failure modes. Across all these areas, airspace rules, terrain models, camera geometry, vehicle limits, weather, and geodetic reference frames are tightly coupled, so errors in one layer propagate into unsafe altitudes, unusable overlap, invalid control, or outputs that look visually plausible but are not operationally or survey-defensible.

Workflow 1: QGroundControl + QGIS
Area Survey Mission Planning
Workflow 2: UgCS + QGIS
Corridor Inspection Mission Planning
Workflow 3: WebODM + CloudCompare
RGB Photogrammetry Reconstruction
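The overlap and camera-geometry coupling above comes down to two standard formulas: ground sample distance for a nadir camera, and the flight-line spacing that achieves a requested side overlap. A sketch with hypothetical camera parameters (flat terrain assumed; real planners adjust for terrain-following altitude):

```python
def gsd_m(sensor_width_mm, focal_mm, altitude_m, image_width_px):
    """Ground sample distance in metres per pixel for a nadir-pointing camera."""
    return (sensor_width_mm / 1000) * altitude_m / ((focal_mm / 1000) * image_width_px)

def line_spacing_m(sensor_width_mm, focal_mm, altitude_m, side_overlap):
    """Spacing between adjacent flight lines for the requested side overlap (0-1)."""
    footprint = (sensor_width_mm / 1000) * altitude_m / (focal_mm / 1000)
    return footprint * (1 - side_overlap)
```

With a 13.2 mm sensor, 8.8 mm lens, and 100 m altitude, the swath is 150 m wide, so 70% side overlap puts lines 45 m apart; an altitude error scales both numbers linearly, which is how one mistake produces unusable overlap everywhere.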

Fire Science & Safety Engineering

Fire Science & Safety Engineering is the professional practice of quantifying fire, smoke, evacuation, and hazardous-atmosphere risk in buildings, infrastructure, and industrial sites through analysis, design review, consequence assessment, and emergency-planning support. In building fire engineering and performance-based design, teams use FDS to test fire growth, detector response, and local tenability; in smoke control and compartment analysis, CFAST is the common anchor for multi-room layer development and system timing; in egress and human-behavior studies, Pathfinder supports occupant movement, queueing, and route-capacity assessment; in hazardous materials consequence analysis and emergency planning, ALOHA is used for initial toxic or flammable release screening and MARPLOT for receptor and zone mapping; and in higher-fidelity industrial consequence work, OpenFOAM supports terrain- and obstruction-sensitive dispersion refinement. The field spans specialist consultants, owner-operator safety teams, insurers, independent peer reviewers, and regulators. Across all these areas, results are constrained by coupled physics, human response assumptions, threshold definitions, weather or ventilation boundary conditions, and model validity limits, so small errors in geometry, source terms, timing, or receptor placement propagate directly into tenability, evacuation, and protective-action decisions.

Workflow 1: FDS + Smokeview
Single-Compartment Fire Dynamics and Detector Response
Workflow 2: CFAST + Smokeview
Multi-Compartment Smoke Spread and Route Tenability
Workflow 3: Pathfinder + FDS
Fire-Coupled Egress Performance Assessment
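The source terms that drive these models often start from a t-squared design fire, Q = alpha * t^2. A sketch using the growth coefficients commonly tabulated in fire-engineering references (verify against the governing standard before design use):

```python
import math

# Commonly tabulated t-squared growth coefficients, kW/s^2
ALPHA = {"slow": 0.00293, "medium": 0.01172, "fast": 0.0469, "ultrafast": 0.1876}

def hrr_kw(t_s, growth="medium"):
    """Heat release rate Q = alpha * t^2 for a t-squared design fire."""
    return ALPHA[growth] * t_s ** 2

def time_to_hrr_s(q_kw, growth="medium"):
    """Time for a t-squared fire to reach a target heat release rate."""
    return math.sqrt(q_kw / ALPHA[growth])
```

Picking "medium" versus "fast" roughly halves the time to any given HRR, which is why a wrong growth assumption propagates straight into detector-response and egress margins.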

Urban Planning and Transportation

Urban planning and transportation is the professional practice of allocating growth, regulating development, and designing multimodal access across cities, corridors, and regions. Experts generally work in land-use and comprehensive planning, where CommunityViz is used to test growth and capacity scenarios; transportation planning and regional modeling, where PTV Visum anchors demand forecasting and assignment; traffic operations and corridor engineering, where SUMO is used for signal, queue, and bottleneck analysis; transit planning and accessibility, where OpenTripPlanner supports network redesign and travel-time equity analysis; and zoning, development review, and policy advisory, where QGIS structures parcel, overlay, and compliance work. The field spans both public-sector operators and advisory teams serving MPOs, transit agencies, DOTs, municipalities, and developers. Across all these areas, outputs depend on tightly coupled rule layers—parcel and zoning controls, network topology, service schedules, socio-economic forecasts, and observed counts or travel times—so errors in IDs, temporal alignment, or spatial coding propagate directly into forecasts, compliance conclusions, and capital decisions.

Workflow 1: QGIS + AequilibraE
Peak-Hour Regional Road Assignment Calibration
Workflow 2: QGIS + SUMO
Signalized Corridor Microsimulation Calibration
Workflow 3: UrbanSim + QGIS
Parcel-Level Growth Forecast
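The assignment calibration in Workflow 1 rests on a volume-delay function, most commonly the BPR curve. A minimal sketch with the classic default coefficients (agencies routinely recalibrate alpha and beta per facility type):

```python
def bpr_time(t0, volume, capacity, alpha=0.15, beta=4):
    """Bureau of Public Roads volume-delay function:
    congested link time t = t0 * (1 + alpha * (v/c)^beta)."""
    return t0 * (1 + alpha * (volume / capacity) ** beta)
```

At v/c = 1 the default curve inflates free-flow time by exactly 15%, so a miscoded capacity shifts every assigned path cost downstream of it.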
📚

Education & Library

Educational Technology

Educational Technology is the professional practice of designing, configuring, operating, integrating, and analyzing digital teaching and learning environments across universities, schools, online program managers, publishers, and institutional teaching-and-learning teams. Its core regions include course design and delivery operations, typically anchored in Moodle or Canvas; assessment and grade operations, where Moodle Gradebook and Canvas Gradebook govern weighting, overrides, and final-grade release; platform integration and interoperability, where LTI and OneRoster connect external tools and student-information systems; learning records and analytics, where Open edX, Learning Locker, and Apache Superset turn event data into retention and engagement views; and adjacent governance work spanning accessibility, privacy, and academic-policy implementation in production systems. The field includes both operator roles inside institutions and advisory or implementation roles that migrate courses, rationalize tool stacks, and harden data flows across platforms. Across all of these regions, the hard part is not content authoring but maintaining semantic consistency across course structure, roster identity, grading logic, release conditions, third-party launches, and event schemas, because policy errors in any one layer propagate directly into learner access, final grades, compliance posture, and downstream analytics.

Workflow 1: Moodle + H5P
Configure a Term Course Shell and Release Rules
Workflow 2: Moodle + 1EdTech Common Cartridge
Migrate a Common Cartridge Course and Perform QA
Workflow 3: Moodle Gradebook + OneRoster
Close a Multi-Section Gradebook and Produce SIS Export
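The grading logic that gradebook closeout depends on is, at its core, a weighted mean with an explicit rule for empty categories. A sketch of one common policy, renormalizing weights over the categories that actually have scores; this is an assumed policy for illustration, and the platform's configured aggregation method is what governs:

```python
def weighted_final(scores, weights):
    """Weighted final grade from category percentages. Weights are
    renormalized over present categories so an empty category does not
    silently drag the total toward zero."""
    used = {c: w for c, w in weights.items() if c in scores}
    total_w = sum(used.values())
    return sum(scores[c] * w for c, w in used.items()) / total_w
```

With an unused "project" category, 90/80 on homework and exams yields 85.0 rather than 68.0, exactly the kind of policy difference that changes released final grades.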

Library & Information Science

Library & Information Science is the professional management of bibliographic description, access operations, discovery, and long-term stewardship of physical and digital collections across libraries, archives, consortia, and platform vendors. Recognizable practice areas include shared cataloging and authority control, where catalogers work in WorldShare Record Manager and MARCEdit; metadata remediation and migration, where legacy records are normalized and loaded through Koha import workflows; access services and circulation policy administration, where Koha governs patron, item, calendar, and notice logic; digital preservation ingest and packaging, where Archivematica turns transfers into managed preservation packages; preservation audit and format-risk analysis, where DROID profiles holdings at scale; and repository delivery and archival access, where DSpace exposes managed objects downstream. The field includes both operating roles inside institutions and advisory or implementation roles in consortium, migration, and platform settings. Across all these areas, correctness depends on tight coupling between descriptive standards, local policy, platform state, and external registries: an edition mismatch, broken authority form, bad circulation calendar, or invalid checksum propagates immediately into discovery defects, billing defects, ingest failures, or unverifiable preservation events. The technical difficulty lies in preserving machine-actionable integrity across schemas, rule engines, and audit trails rather than in any single record or file.

Workflow 1: MARCEdit + WorldShare Record Manager
Full-Level Cataloging and Authority Control for Digitized Monographs
Workflow 2: MARCEdit + FOLIO Data Import
Legacy MARC Remediation and Overlay Decisioning
Workflow 3: Koha + Koha Reports
Circulation Policy Configuration and Transaction Execution
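Machine-actionable integrity checks of the kind this card describes are often as small as a check digit. The ISBN-13 rule (alternating 1/3 weights, mod 10) is a standard remediation-pipeline validation:

```python
def isbn13_valid(isbn):
    """Validate an ISBN-13 check digit: digits weighted 1,3,1,3,...
    must sum to a multiple of 10. Hyphens and spaces are ignored."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    checksum = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return checksum % 10 == 0
```

A single transposed or altered digit fails the check, which is why batch loads gate on it before overlay decisioning.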
🌾

Agriculture & Environment

Environmental Science & Engineering

Environmental Science & Engineering is the professional practice of converting environmental observations, spatial data, and physical assumptions into defensible analyses for groundwater, air, ecosystems, forests, and related compliance decisions. Senior experts typically locate themselves in hydrogeology and groundwater modeling, where ModelMuse and MODFLOW 6 support aquifer, recharge, and capture-zone studies; air quality permitting and dispersion analysis, where AERMOD anchors source-impact and downwash work; ecological assessment and habitat modeling, where MaxEnt supports presence-only suitability mapping; forestry inventory and remote sensing, where PDAL and lidR convert LiDAR point clouds into tree- and stand-level metrics; and geospatial data engineering and environmental informatics, where QGIS underpins coordinate harmonization, raster alignment, and cross-domain packaging for both consulting teams and operating organizations. Adjacent advisory work includes permit support, agency review, and expert interpretation of model outputs for project approval, monitoring design, and resource management. Across these regions, difficulty comes from coupled spatial, temporal, and regulatory constraints: coordinate systems, vertical datums, meteorology, terrain, observation windows, sampling bias, and solver settings must align before any result is defensible. Errors introduced in preprocessing or parameterization propagate nonlinearly into calibration statistics, design concentrations, habitat extents, and inventory metrics because the field depends on chained transformations between GIS layers, numerical engines, and domain-specific validation criteria.

Workflow 1: QGIS + GDAL
Multi-source Geospatial Harmonization for Environmental Modeling
Workflow 2: ModelMuse + PEST++
Groundwater Flow Calibration and Capture-Zone Delineation
Workflow 3: AERMOD + BPIPPRM
Permitting-grade Point-source Air Dispersion Modeling
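The dispersion work in Workflow 3 is ultimately built on the steady-state Gaussian plume solution; regulatory models like AERMOD go far beyond it, but the textbook form with ground reflection is the conceptual core. A sketch (dispersion coefficients sigma_y and sigma_z would come from stability-class curves, not constants):

```python
import math

def plume_conc(q_gs, u_ms, sigma_y, sigma_z, y_m, z_m, h_m):
    """Steady-state Gaussian plume concentration (g/m^3) at receptor (y, z)
    for a point source of strength q at effective height h, with ground
    reflection via the image-source term."""
    lateral = math.exp(-y_m ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-((z_m - h_m) ** 2) / (2 * sigma_z ** 2))
                + math.exp(-((z_m + h_m) ** 2) / (2 * sigma_z ** 2)))
    return q_gs / (2 * math.pi * u_ms * sigma_y * sigma_z) * lateral * vertical
```

At ground level on the centerline with h = 0 the expression collapses to q / (pi * u * sigma_y * sigma_z), a convenient analytical check when validating a model chain.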

Precision Agriculture

Precision Agriculture is the professional practice of converting georeferenced machine, soil, terrain, and remote-sensing data into field-specific agronomic decisions for large-scale crop production across grower operations, ag retail, consulting, OEM precision teams, and applied research. Recognizable practice areas include farm operations data capture and fleet recordkeeping in John Deere Operations Center; yield analytics, map cleaning, and historical performance review in Ag Leader SMS and Yield Editor; soil conductivity, salinity, and sampling design in ESAP; spatial agronomy and management-zone design in QGIS; prescription authoring in Ag Leader SMS or John Deere Operations Center; and machine interoperability, ISOXML exchange, and execution audit in AEF Taskdata Validator. The field includes both operator-led execution and advisory-led interpretation: farm teams emphasize machine-ready exports and seasonal decisions, while consultants and agronomists standardize zone logic, recommendation rules, and validation across fields and clients. Across all these areas, technical difficulty comes from the coupling of moisture normalization, sensor lag, coordinate systems, sampling density, spatial interpolation, controller constraints, and file-exchange standards, so a result can look agronomically plausible yet still be numerically wrong, operationally brittle, or unreadable by the target machine.

Workflow 1: Yield Editor + QGIS
Harvest Log Cleaning and Calibrated Yield Mapping
Workflow 2: QGIS + PAT
Multi-Year Management Zone Delineation
Workflow 3: ESAP + QGIS
Soil Sampling Design and Nutrient or Salinity Surface Modeling
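The moisture normalization named above is a one-line dry-matter adjustment: as-harvested yield is rescaled to a trade-standard moisture basis so fields cut at different moistures compare fairly. A sketch (the standard-moisture table here is an assumption to confirm against local trade rules):

```python
# Trade-standard moisture percentages (assumed values for illustration)
STANDARD_MOISTURE = {"corn": 15.5, "soybeans": 13.0}

def normalize_yield(wet_yield, moisture_pct, crop="corn"):
    """Rescale an as-harvested yield to the crop's standard moisture basis."""
    std = STANDARD_MOISTURE[crop]
    return wet_yield * (100 - moisture_pct) / (100 - std)
```

Corn harvested at 20% moisture showing 200 units wet normalizes to about 189.3, a gap large enough to flip zone rankings if the correction is skipped on some loads but not others.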
📊

Social Sciences

Social Science & Economics

Social Science & Economics is the empirical and model-based analysis of households, firms, markets, labor systems, and macroeconomic aggregates for research, policy, regulatory, and advisory use. Experts typically self-locate in policy evaluation and public-sector advisory, where Stata anchors panel and event-study work; labor and household microdata analysis, where the IPUMS CPS extract system standardizes weighted repeated cross sections; industrial organization and antitrust work, where PyBLP supports demand estimation and merger simulation; macroeconomic structural modeling for central banks and ministries, where Dynare is the DSGE engine; official-statistics and nowcasting practice, where X-13ARIMA-SEATS governs seasonal adjustment; and replication and data-publication work, where openICPSR packages reference data and code for published results. The field spans both operator roles inside statistical agencies, central banks, and research units, and advisory roles in competition economics, policy consulting, and applied research. Across these areas, results depend on tight coupling between source-data vintages, survey weights, classification systems, identification assumptions, solver behavior, and publication conventions, so a small error in one layer can flip treatment effects, elasticities, impulse responses, or forecast revisions downstream. The technical difficulty is preserving the correct economic and institutional meaning of raw files as they are transformed into estimable objects, diagnostics, and counterfactual outputs.

Workflow 1: did + fixest
Heterogeneous Minimum-Wage Policy Evaluation
Workflow 2: IPUMS CPS extract system + Stata
Weighted CPS Labor Heterogeneity Table
Workflow 3: PyBLP + Python
Random-Coefficients Demand Estimation and Merger Counterfactual
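The policy-evaluation designs in Workflow 1 build on the canonical 2x2 difference-in-differences estimand, which is worth stating in its simplest form: the treated group's change net of the control group's change, valid only under parallel trends. A minimal sketch with hypothetical group means:

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Canonical 2x2 difference-in-differences estimate:
    (treated post - treated pre) - (control post - control pre)."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)
```

Modern heterogeneous-effects estimators (as in the `did` package) generalize exactly this comparison across cohorts and periods, which is why a vintage or weighting error in any cell flips the headline effect.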

Is your field missing?

We are actively expanding coverage. Domain experts are the heart of this benchmark.

Propose a New Domain →