Qlik Data Engineer
Description
Summary
This position is NOT eligible for visa sponsorship.
This role will specialize in comprehensive data pipeline development and management, enabling our current Business Intelligence team to focus on analytics and business value while ensuring robust, scalable data integration solutions.
Background and Current State
Our Business Intelligence team currently operates as a multi-disciplinary unit managing the complete data lifecycle from ingestion to visualization. The current structure requires our BI professionals to wear many hats, handling responsibilities that span data engineering, ETL/ELT development, data modeling, report creation, dashboard development, and business relationship management. While this approach has served us well in establishing our data capabilities, the increasing complexity of our data ecosystem and growing business demands have created capacity constraints and specialization challenges.
Our data integration landscape has evolved significantly with the adoption of Qlik Data Integration and Qlik Talend Cloud Enterprise Edition. The current team's broad responsibilities limit the depth of specialization possible in any single area, particularly in the technical aspects of modern real-time data integration and the advanced features available in Qlik Talend Cloud Enterprise Edition. As our organization increasingly requires real-time analytics, operational reporting, and seamless data movement across hybrid cloud environments, we need dedicated expertise to ensure our Qlik platform delivers optimal performance and business value.
Primary Job Responsibilities
Data Integration Architecture and Engineering
- Develop and maintain ETL/ELT data pipelines leveraging Qlik Data Integration to build the data warehouse across bronze, silver, and gold layers
- Build consumer-facing data marts, views, and push-down calculations to enable improved analytics by the BI team and Citizen Developers
- Implement enterprise data integration patterns supporting batch, real-time, and hybrid processing requirements
- Coordinate pipeline execution and monitoring to ensure timely reloads of the enterprise data warehouse (EDW)
Technical Implementation and Platform Management
- Configure and manage Qlik Data Integration components including pipeline projects, lineage, data catalog, data quality, and data marketplace
- Implement data quality rules and monitoring using Qlik and Talend tools
- Manage the Qlik tenant, security, and access, and administer the Data Movement Gateway
Performance, Monitoring, Governance and Management
- Monitor and optimize data replication performance, latency, and throughput across all integration points
- Implement comprehensive logging, alerting, and performance monitoring
- Conduct regular performance audits and capacity planning for integration infrastructure
- Establish SLA monitoring and automated recovery procedures for critical data flows
Collaboration and Enterprise Support
- Provide technical expertise on Qlik Data Integration best practices and enterprise patterns
- Support database administrators and infrastructure teams with replication and integration architecture
- Lead technical discussions on data strategy and platform roadmap decisions
Key Qualifications
Required Skills
- Bachelor's degree in Computer Science, Information Systems, or related technical field
- 4+ years of experience in enterprise data integration with at least 2 years of hands-on Qlik or Talend experience
- Strong understanding of change data capture (CDC) technologies and real-time data streaming concepts
- Strong understanding of data lake and data warehouse strategies and data modeling
- Advanced SQL skills with expertise in database replication, synchronization, and performance tuning
- Experience with enterprise ETL/ELT tools and data integration patterns
- Proficiency in at least one programming language (Java, Python, or SQL scripting)
Preferred Qualifications
- Qlik Data Integration certification or Talend certification (Data Integration, Data Quality, or Big Data)
- Experience with cloud platforms (AWS or Azure) and hybrid integration scenarios
- Experience with Snowflake
- Understanding of data governance frameworks and regulatory compliance requirements
- Experience with API management and microservices architecture
Soft Skills
- Strong analytical and troubleshooting capabilities with complex integration challenges
- Excellent communication skills with ability to explain technical integration concepts to business stakeholders
- Collaborative approach with experience in cross-functional enterprise teams
- Detail-oriented mindset with commitment to data accuracy and system reliability
- Adaptability to evolving integration requirements and emerging technologies
EEO Statement: Victaulic is an Equal Employment Opportunity (EOE/M/F/Vets/Disabled) employer and welcomes all qualified applicants. Applicants will receive fair and impartial consideration without regard to race, gender, color, religion, national origin, age, disability, veteran status, sexual orientation, genetic data, or other legally protected status.
Victaulic Staffing Partner Communication Policy
All staffing agencies are strictly forbidden from directly contacting any Victaulic employees, except those within the Human Resources/Talent Acquisition team. All communications, inquiries, and candidate submissions must be routed through Victaulic's Human Resources/Talent Acquisition team. Non-compliance with this policy may result in the suspension of the partnership, cancellation of the current contract, and/or the imposition of a mandatory probation period before any future business can resume. Additionally, non-compliance may lead to a permanent ban on future business. This policy ensures a streamlined and compliant recruitment process.