Title: Senior Data Brokering Engineer (Kafka)
Location: Washington DC Metro Area
Netorian is seeking a Senior Data Brokering Engineer (Kafka).
The Senior Data Brokering Engineer (Kafka) is responsible for architecting data brokering solutions that improve security value, service management, and scalability for our clients. The successful candidate will be a strong technologist with a practical yet creative mind, able to communicate clearly, demonstrate a solid understanding of the client's business and technical requirements, and collaborate effectively with client information security and IT/IS teams as well as project and program managers to develop and deliver world-class results.
In this role, the Senior Data Brokering Engineer (Kafka) reports to the Project Manager and manages a small team (2-3 direct reports). Additional responsibilities include assessing, optimizing, and integrating the customer’s sensor array such that delivery of the data to the appropriate Kafka topic is as accurate and timely as possible. This is a client-facing role that requires being able to coordinate effectively between more than one reporting structure, working closely with client IT security and IT/IS capabilities in addition to Netorian management and team members. This role does not require significant travel.
- Employment type: Full-time, exempt.
- Compensation: Competitive, based on qualifications and experience.
- Work location: National Capital Region and Ft. Meade area (remote work may be available after initial development is completed).
- Travel: 25% or less.
- Must possess an active DoD Top Secret security clearance and be SCI eligible.
- Must successfully pass a criminal background check and drug screening.
- A bachelor’s degree in computer science, data science, or applicable field is required.
- Four (4) years of demonstrable equivalent experience specific to data brokering, Apache Kafka, etc., may be considered as meeting the education requirement.
Certifications & Training:
- CompTIA Security+ certification required.
- CISSP certification is desired.
- PMI PMP certification is desired.
- ITIL Foundation certification is desired.
Experience & Skills:
- Ability to rapidly understand client’s business strategies and the capability to apply creative problem-solving skills to deliver high-impact solutions to meet business needs is a must.
- Expert-level knowledge and expertise working with Apache Kafka, MicroFocus Event Broker, or Confluent Enterprise Community Edition is required.
- Demonstrable experience in a leadership position on major open source projects or initiatives involving security research or design/frameworks, information management, data and content modeling, and large-scale data analytics is strongly desired.
- Seven-plus (7+) years of demonstrable applied technology experience in data brokering and DCO operations defining strategy around security monitoring, incident management, regulatory compliance, and process improvement is required.
- Four-plus (4+) years of demonstrable experience in the design, engineering, configuration, and maintenance of data brokering solutions (Kafka) is required.
- One-plus (1+) years of demonstrable experience developing Kafka Streams applications is desired.
- One-plus (1+) years of demonstrable experience related to data enrichment services and tools is required.
- One-plus (1+) years of demonstrable experience working with LogStash is required.
- Expert-level sensor grid architecture knowledge and expertise.
- Expert-level Unix / Linux knowledge and expertise.
- Familiarity with Cyber Kill Chain methodologies.
- Familiarity with the Windows Event Forwarding (WEF) framework.
- Understanding of network firewalls, load balancers and complex system designs.
- Proficiency with applicable Software Development Life Cycle (SDLC) methodologies (e.g., ITSA).
- Expertise in various connector technologies (Kafka Connect, Elastic Beats, SmartConnectors, etc.) with strong Regex skills.
- Command of Python, Perl, SQL, Regex, Shell Scripting, and Java is preferred.
- Experience installing and maintaining open source log capture technologies such as Syslog-NG, Snare, LogStash, MSCOM, etc.
- One-plus (1+) years of demonstrable experience working with Elasticsearch is required. Must be knowledgeable in data analytics and establishing services that interact with Elastic, Hadoop, and Accumulo.
- One-plus (1+) years of demonstrable experience working with systems in a micro services framework.
- Excellent written and verbal communication skills.
Description of Work:
The Senior Data Brokering Engineer (Kafka) must be able to contribute effectively and leverage expert-level knowledge of Apache Kafka, data enrichment, DevSecOps, and IT/IS architectural design to ensure that operational and business goals are met. The Senior Data Brokering Engineer (Kafka) works closely with the prime and subcontractors to ensure that all coordinated efforts and pieces of the overall project stay within tolerances. The Senior Data Brokering Engineer (Kafka) must be capable of effectively analyzing the current and target states of the data collection architecture to ensure that effective and efficient data brokering, appropriate data enrichment, and data translation are achieved through productive implementation of the platform.
Duties and responsibilities include:
- Aligns with client needs and contract requirements to assess and analyze operational performance and identify opportunities for improvement and the enhancement of existing and relevant services.
- Partners with the client to evaluate existing data dissemination strategies, technologies, and tools to facilitate the identification of critical elements, weaknesses, and opportunities for improvement.
- Works independently and in coordination with others to architect solutions that have a measurable positive impact on security value, service management, and client satisfaction.
- Creates architecture diagrams, workflow models, and proposals/presentations for key stakeholders with a wide range of business, security, and IT experience and capabilities.
- Delivers phased plans for architecting recommended solutions.
- Coordinates with the client and key stakeholders to gather requirements and design the solutions to support those requirements.
- Develops and creates reference architectures and models and related/required documentation.
- Architects solutions to drive scalability, sustainability, efficiency, and automation (including changes to staff/resources, processes, technologies, and tools).
- Supports the overall project through interactive client sessions to assist with implementation, support, and usage of multiple product vendors and technologies.
- Performs other duties as assigned.
- - - - -
Netorian offers a complete benefits package that includes medical/dental/vision, a 401(k), tuition assistance, paid holidays, and vacation. Netorian is an Equal Opportunity Employer (EOE): Minorities, Women, Veterans, and those with Disabilities.