Current Job Openings
Job Title: Data Engineer
Job Description:
Our client is looking for a Data Engineer with strong development skills.
Location: Remote
Job type: Contract (Long Term up to 2 years)
Pay: $65/hr C2C or W2
Education: Degree in Computer Science
Responsibilities:
- Design and implement reliable data pipelines to integrate disparate data sources into a single Data Lakehouse.
- Design and implement data quality pipelines to ensure data correctness and build trusted datasets.
- Design and implement a Data Lakehouse solution which accurately reflects business operations.
- Assist with data platform performance tuning and physical data model support including partitioning and compaction.
- Provide guidance in data visualizations and reporting efforts to ensure solutions are aligned to business objectives.
- Collaborate with cross-functional teams to integrate data from different systems and ensure data consistency and quality.
- Monitor and optimize data pipelines and database performance to meet business requirements and performance standards.
Qualifications:
Minimum education and experience required: Master’s degree or the equivalent in Computer Science, Information Technology, Engineering, or a related field and 2 years of experience in software engineering, OR a Bachelor’s degree or the equivalent in Computer Science, Information Technology, or Engineering.
Required Skills:
- Should have 1+ year of experience with PySpark, Python, T-SQL, and Scala.
- Should have 1+ year of experience with Azure or GCP.
- Experience with GCP and BigQuery.
- Design, deploy, manage, and operate scalable, highly available, and fault-tolerant ETL / BI / big data / analytics systems on GCP.
- Should have 1+ year of experience with data pipeline building, data modeling, and data ingestion and consumption.
- Thorough understanding of and working experience with Azure platform services, including tools such as ADF, ADLS, and Databricks (writing Databricks notebooks).
- Should have experience with tools like Kafka for streaming data handling.
- Experience with software development practices such as design principles and patterns, testing, refactoring, CI/CD, and version control.
Note: We do not work with third party agencies or vendors.
Interested candidates, please forward your resume to HR@libertytechhub.com
Job Title: Hadoop DBA/Migration Engineer
Job Description:
Our client is looking for a Hadoop Developer/Administrator with at least 2–3 years of hands-on technical and professional experience.
Location: Bentonville, Arkansas
Job Type: Contract (Long term up to 2+ Years)
Pay: $60/hr
Education: Bachelor’s Degree in Computer Science
Responsibilities:
- Hadoop cluster installation, including Cloudera Manager components and features, Cloudera Manager installation, CDH 5.x/6.x installation, and Cloudera Data Flow.
- Resource management with YARN, including creating and managing resource queues with quotas, monitoring resource usage and performance tuning, configuring dynamic resource pools, configuring groups with static service pools, and configuring the Fair Scheduler.
- Cloudera service configuration, including Cloudera Manager constructs for managing configurations, locating configurations and applying configuration changes, managing role instances and adding services, high availability, and Cloudera Navigator administration, configuration, and management.
- Hadoop security, including securing a Hadoop cluster with Kerberos, TLS, and HDFS encryption, and Hadoop AC.
- Kafka cluster architecture, setup, administration, and performance tuning.
- On-prem to Azure migration experience.
- Azure experience is a plus.
Qualifications:
Master’s degree in Computer Science or a related field and 2 years of experience in software engineering or a related field. Strong experience with Big Data platforms such as Hadoop and Teradata. Azure experience.
