
Data Ingestion Engineer

AT&T


Location:
El Segundo, CA
Date:
06/22/2017
Job Code:
att4-4799651

Job Details

Company AT&T

Job Title Data Ingestion Engineer

Job ID att4-4799651

Location: El Segundo, CA, 90245, USA

Description AT&T is leading the
way to the future – for customers, businesses and the industry. We’re
developing new technologies to make it easier for our customers to stay connected to their world.
With a network that covers 225 countries and serves more than 120 million
customers, we’d say we’re well on our way. Together, we’ve built a premier
integrated communications company and an amazing place to work and grow.



AT&T Entertainment
Group (AEG) is the world’s largest pay TV provider and the undisputed
leader in sports programming. The Broadcast Software Engineering (BSE) – Data
Architecture group sets the strategic direction for ingesting, processing and
delivering Big Data. Our vision is to provide better business insights
by analyzing data in our Big Data platform.



**Key Responsibilities**



+ Strong knowledge base in Java, J2EE, microservices, Kafka, Hadoop, Spark, Ranger, Knox and other big data technologies.

+ Mentoring junior developers and helping achieve all technology goals.

+ Working with various cross-functional teams to deliver robust technology solutions.

+ Reducing technical debt and providing input to product backlog grooming sessions.



**Work Environment**



This position is located in El Segundo, CA. This fast-paced, high-tech
environment is perfect for individuals seeking to exercise innovation and
out-of-the-box thinking to move our state-of-the-art technology to the next
level.



**Skills & Attributes**



The ideal candidate is a highly motivated and creative senior-level
software engineer who is ready to participate in a team environment and
contribute to the success of AT&T Entertainment Group. The successful
candidate will possess excellent organization and time-management skills;
prioritizing schedules and configuring systems efficiently are
essential. A good understanding of industry-standard Hadoop / Kafka / NiFi
architecture, microservices architecture, tooling and best practices is also
required. The selected candidate will possess good communication skills
(verbal and written).



**Qualifications**



+ Master’s degree in Computer Science with 6 years of experience, or Bachelor’s degree in Computer Science with 8 years of experience

+ Experience working with complex software in a parallel-processing environment, gained through a combination of academic studies and work experience

+ Experience in the following technologies:



+ Unix-based OS (RHEL/OEL is mandatory)

+ Hadoop, Spark, NiFi, Ranger, Knox

+ General Kafka architecture: Kafka topics (partitions, segments, the Kafka log, replication, assignments, ISR, partition leader, under-replicated state, offsets), Kafka brokers, producers, consumers, Kafka MirrorMaker, and monitoring a Kafka cluster

+ Kafka console tools, such as the console consumer/producer, simple consumer, offset checker, dump log segment and OffsetShell; ZooKeeper, znodes, zkCli, the ZooKeeper ensemble (leader, follower, fault tolerance), and monitoring ZooKeeper

+ Shell/Bash scripting

+ DevOps tools: Ansible, Docker

+ Graphite/Grafana stack, Kafka Manager, Apache Ambari, tools for monitoring Kafka lag (Burrow, Kafka Offset Monitor, Kafka Lag Monitor), and tools for collecting OS metrics, such as collectd

+ TCP/IP, and an understanding of how to troubleshoot network connectivity between hosts, how to check local and destination port availability, etc.

+ AWS, Rackspace cloud

+ Java

+ CI/CD (Jenkins, Ansible)
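To give candidates a feel for the Kafka concepts listed above (topics, partitions, offsets, producers and consumers), here is a minimal illustrative sketch in Python. This is not the real Kafka client API; the `ToyTopic`/`ToyConsumer` classes are hypothetical stand-ins that only model how records land in partitioned append-only logs and how a consumer tracks its offset.

```python
# Toy model of Kafka concepts: a topic as N append-only partition logs,
# a keyed producer, and a consumer that tracks its own read offsets.
# Hypothetical names; not the real Kafka client API.

import hashlib

class ToyTopic:
    """A topic as a fixed set of append-only partition logs."""
    def __init__(self, name, num_partitions):
        self.name = name
        self.partitions = [[] for _ in range(num_partitions)]

    def partition_for(self, key):
        # Deterministic key -> partition mapping, so records with the
        # same key keep their relative order (the idea behind Kafka's
        # default keyed partitioning).
        digest = hashlib.md5(key.encode()).digest()
        return int.from_bytes(digest[:4], "big") % len(self.partitions)

    def produce(self, key, value):
        p = self.partition_for(key)
        self.partitions[p].append(value)
        return p, len(self.partitions[p]) - 1   # (partition, offset)

class ToyConsumer:
    """Remembers its read position (offset) per partition."""
    def __init__(self, topic):
        self.topic = topic
        self.offsets = [0] * len(topic.partitions)

    def poll(self, partition):
        log = self.topic.partitions[partition]
        records = log[self.offsets[partition]:]
        self.offsets[partition] = len(log)      # "commit" the new offset
        return records

topic = ToyTopic("events", num_partitions=3)
topic.produce("user-1", "login")
topic.produce("user-1", "click")

consumer = ToyConsumer(topic)
p = topic.partition_for("user-1")
print(consumer.poll(p))   # → ['login', 'click']
```

Polling the same partition again returns nothing until new records are produced, which is the essence of offset tracking.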
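The TCP/IP troubleshooting item above (checking local and destination port availability) can be sketched with the Python standard library. A minimal example, assuming only `socket`; the host and port values are illustrative:

```python
# Minimal TCP reachability check: the kind of port-availability probe
# used when troubleshooting connectivity between brokers, ZooKeeper
# nodes and clients. Host/port values below are examples only.

import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. is a broker listening on the default Kafka port locally?
print(port_open("127.0.0.1", 9092))
```

The same check is often done from the shell with `nc -zv <host> <port>`; the Python version is handy inside monitoring or deployment scripts.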



**Additional Qualifications**



Good knowledge of the following technologies:



+ Kubernetes

+ Elasticsearch

+ Confluent Kafka Connect

+ Python

+ Hadoop, HDFS



