About Onit
We’re redefining the future of legal operations through the power of AI. Our cutting-edge platform streamlines enterprise legal management, matter management, spend management, and contract lifecycle processes, transforming manual workflows into intelligent, automated solutions.
We’re a team of innovators using AI at the core to help legal departments become faster, smarter, and more strategic. As we continue to grow and expand the capabilities of our new AI-centric platform, we’re looking for bold thinkers and builders who are excited to shape the next chapter of legal tech.
If you’re energized by meaningful work, love solving complex problems, and want to help modernize how legal teams operate, we’d love to meet you.
About the Role
We are looking for a highly experienced Data Warehouse Architect to design, optimize, and scale our analytics platform. This role is hands-on and performance-focused, requiring deep expertise in PostgreSQL query tuning, strong experience using Datadog for performance analysis, and advanced Python scripting for ETL pipelines.
You will own the performance, reliability, and scalability of the data warehouse while partnering closely with engineering, analytics, and business teams.
What You’ll Do
Design and evolve a scalable data warehouse architecture for analytics and reporting.
Analyze and optimize PostgreSQL queries by reviewing execution plans and Datadog APM metrics.
Identify slow queries, bottlenecks, and resource hotspots; implement indexing, partitioning, and query rewrites.
Tune PostgreSQL configuration for large analytical workloads.
Build and optimize Python-based ETL/ELT pipelines for data ingestion and transformation.
Ensure ETL jobs are efficient, reliable, and well-monitored.
Create dashboards, alerts, and KPIs in Datadog to monitor warehouse health.
Enforce data quality, consistency, and governance standards.
Perform root-cause analysis for performance issues and production incidents.
Mentor engineers on SQL optimization, data modeling, and ETL best practices.
Core Skills & Technologies
Data & Performance
PostgreSQL (expert-level query tuning)
Execution plan analysis (EXPLAIN / EXPLAIN ANALYZE)
Indexing, partitioning, schema optimization
Observability
Datadog (APM, DB monitoring, dashboards, alerts)
ETL & Scripting
Python (advanced scripting)
ETL / ELT pipeline design
Data validation and transformation logic
Architecture
Data warehouse modeling (star/snowflake schemas)
Performance-driven analytics design
What We’re Looking For
6-9+ years of experience in data engineering, data warehousing, or database architecture.
Proven expertise in PostgreSQL performance tuning at scale.
Strong hands-on experience using Datadog to analyze and troubleshoot database performance.
Advanced Python skills for ETL, automation, and data processing.
Deep SQL expertise with experience optimizing complex analytical queries.
Strong communication skills and ability to collaborate across teams.
Ownership mindset with a focus on reliability, performance, and quality.