Job Description
Compression Expert
Remote
Contract
Years of Experience: 7+
Job Description:
Subject-matter expertise including, but not limited to: current lossless compression algorithms (Huffman coding, etc.); how compression is implemented and optimized across the data supply chain in multiple high-volume settings (cloud, data center, etc.); data security and cybersecurity; and information theory, with deep knowledge of entropy and the theoretical limits of compression (Shannon's source coding theorem)
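To illustrate the entropy concept referenced above, here is a minimal Python sketch (illustrative only; the message and the simple per-byte frequency model are assumptions for the example) that computes the Shannon entropy of a byte string, i.e., the lower bound on average bits per symbol for any lossless code under that model:

```python
from collections import Counter
import math

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per symbol: H = -sum p(x) * log2 p(x)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = b"abracadabra"
h = shannon_entropy(msg)
# Shannon's source coding theorem: under this symbol model, no lossless
# code can average fewer than h bits per symbol on this source.
print(f"entropy: {h:.3f} bits/symbol")
print(f"lower bound: {math.ceil(h * len(msg))} bits total")
```

A uniform byte distribution maximizes this at 8 bits/symbol (incompressible); skewed distributions drop below it, which is the headroom compressors exploit.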
Expert in developing and optimizing lossless compression algorithms
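As a concrete instance of the lossless algorithms in scope, a compact Huffman-tree sketch in Python (the frequency-keyed tree construction and the sample message are illustrative assumptions, not a production implementation) that derives per-symbol code lengths, whose average always falls within one bit of the entropy:

```python
import heapq
from collections import Counter

def huffman_code_lengths(data: bytes) -> dict:
    """Code length per symbol from a Huffman tree built on byte frequencies."""
    freqs = Counter(data)
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    # Heap entries: (weight, tiebreak, {symbol: depth_so_far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every contained symbol one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

msg = b"abracadabra"
lengths = huffman_code_lengths(msg)
freqs = Counter(msg)
avg = sum(lengths[s] * freqs[s] for s in freqs) / len(msg)
print(f"average code length: {avg:.3f} bits/symbol")
```

The average code length here sits between H and H + 1 bits/symbol, matching the bound guaranteed by Huffman's optimality for prefix codes.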
Mastery of C or C++ for memory management and performance-critical implementations, including delta-compression data-management techniques
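The delta-compression technique named above can be sketched briefly; this is a Python illustration of the idea (a production version would be written in C/C++ per the requirement, and the sample series is an assumption for the example):

```python
def delta_encode(values: list) -> list:
    """Store the first value, then successive differences."""
    if not values:
        return []
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def delta_decode(deltas: list) -> list:
    """Rebuild the original series by cumulative summation."""
    out, acc = [], 0
    for i, d in enumerate(deltas):
        acc = d if i == 0 else acc + d
        out.append(acc)
    return out

# Slowly varying data -> small deltas, which a following entropy
# coder can represent in far fewer bits than the raw values.
samples = [1000, 1002, 1001, 1005, 1007]
deltas = delta_encode(samples)   # [1000, 2, -1, 4, 2]
assert delta_decode(deltas) == samples
```

Delta coding is a transform, not a compressor by itself: the gain comes from the smaller, more predictable residuals it hands to the entropy-coding stage.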
Expertise in transform coding, quantization, and psychovisual/psychoacoustic modeling for media-specific compression
Strong foundations in statistics, probability distributions, and linear algebra for modeling complex data
Proficient in AI-based compression in high-volume environments, using neural networks (e.g., autoregressive models) to model data patterns and achieve higher compression ratios
Understanding of SIMD (Single Instruction, Multiple Data) and GPU acceleration to parallelize encoding and decoding tasks
Performance Benchmarking: Ability to analyze trade-offs between compression ratio, memory usage, and latency
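The ratio/latency trade-off mentioned above can be sketched with Python's standard-library `zlib` (the payload and the chosen levels are assumptions for illustration; a real benchmark would also track memory and use representative data):

```python
import time
import zlib

def benchmark(data: bytes, level: int) -> tuple:
    """Return (compression ratio, encode seconds) for one zlib level."""
    t0 = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - t0
    return len(data) / len(packed), elapsed

# Highly repetitive payload: compresses well at every level.
payload = b"sensor_reading=42;" * 10_000
for level in (1, 6, 9):
    ratio, secs = benchmark(payload, level)
    print(f"level {level}: ratio {ratio:.1f}x, {secs * 1e3:.2f} ms")
```

Higher levels spend more CPU time searching for matches for (usually) better ratios; the right operating point depends on whether the pipeline is throughput-bound, latency-bound, or storage-bound.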
Critical thinking to identify redundancies in specialized data types and to question accepted theoretical standards
Understanding of advanced techniques such as Bits-Back ANS (asymmetric numeral systems), which enable lossless compression with latent-variable models
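Bits-back coding is built on top of ANS. A minimal sketch of the plain rANS core in Python (big-integer, non-streaming, and without the bits-back latent-variable layer; the message and tables are illustrative assumptions) showing the characteristic stack-like behavior: symbols are pushed in reverse and popped in forward order:

```python
from collections import Counter

def build_tables(data: bytes):
    """Per-symbol frequencies and cumulative offsets for rANS."""
    freqs = dict(Counter(data))
    cum, total = {}, 0
    for s in sorted(freqs):
        cum[s] = total
        total += freqs[s]
    return freqs, cum, total

def rans_encode(data: bytes) -> int:
    """Fold the whole message into one integer state."""
    freqs, cum, total = build_tables(data)
    x = 1
    for s in reversed(data):  # encode in reverse so decoding runs forward
        x = (x // freqs[s]) * total + cum[s] + (x % freqs[s])
    return x

def rans_decode(x: int, freqs, cum, total, n: int) -> bytes:
    """Pop n symbols off the integer state."""
    out = []
    for _ in range(n):
        slot = x % total
        # cum is increasing in symbol order, so the largest symbol
        # whose cumulative offset is <= slot owns this slot.
        s = max(sym for sym, c in cum.items() if c <= slot)
        out.append(s)
        x = freqs[s] * (x // total) + slot - cum[s]
    return bytes(out)

msg = b"abracadabra"
freqs, cum, total = build_tables(msg)
assert rans_decode(rans_encode(msg), freqs, cum, total, len(msg)) == msg
```

Production coders renormalize the state into a fixed-width register and stream bytes out; the bits-back construction then reuses decoder output as the random bits that select a latent variable, recovering those bits at decode time.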
Experience designing system integrations that determine how neural compression components fit into existing big-data pipelines and cloud environments such as AWS or Google Cloud