Job Title: Data Engineer

Company: The Judge Group

Location: Wilmington, DE (Hybrid)

Created: 2024-05-04

Job Type: Full Time

Job Description:

Title: Data Engineer

Location: Hybrid in Wilmington, DE

W2 Only Contract to Hire (no sponsorship). NO C2C, MUST BE ON W2. Conversion salary negotiable.

Company Overview:
We are a dynamic and innovative organization seeking a talented Data Engineer to join our team. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data pipelines and ETL processes. If you're passionate about big data, cloud technologies, and creating scalable solutions, we'd love to hear from you!

Responsibilities:

ETL Development:
- Hands-on experience with ETL tools and frameworks (e.g., Ab Initio, Apache NiFi, Talend).
- Design, develop, and optimize data pipelines to extract, transform, and load data from various sources into our data lake.

Programming Skills:
- Proficiency in Java (core Java) or other relevant programming languages (Scala or Python).
- Ability to write efficient, maintainable, and scalable code for data processing.

Big Data Technologies:
- Familiarity with big data technologies such as Hadoop and Spark.
- Experience working with large-scale data sets in a complex organizational environment.

Cloud Native Experience:
- Practical knowledge of cloud platforms, preferably AWS.
- Understanding of cloud-native architecture and best practices.

Database and Querying Languages:
- Proficiency in SQL and NoSQL databases.
- Ability to model and query relational databases (RDBMS) and NoSQL databases.

Event-Driven Architecture:
- Proven experience with event-driven/streaming architectures using tools like Kafka.
- Ability to design and implement real-time data processing solutions.

Unix Shell Scripting:
- Familiarity with Unix/Linux environments and shell scripting.
- Ability to troubleshoot and maintain data pipelines using shell scripts.

Software Engineering Best Practices:
- Working proficiency with software engineering toolsets (e.g., Git, Jenkins, Docker).
- Collaborate with cross-functional teams to ensure code quality and operational stability.

Nice-to-Have:
- Knowledge of AWS services (e.g., S3, Glue, EMR).
- Experience with Ab Initio for ETL development.
- Understanding of Python for data processing.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum 4-5 years of relevant experience in data engineering or related roles.
- Strong problem-solving skills and ability to work independently.

If you're passionate about data engineering, enjoy working in a collaborative environment, and want to contribute to cutting-edge projects, we encourage you to apply!