Journeyman Data Brokering Engineer (Kafka)
Summary
Title: Journeyman Data Brokering Engineer (Kafka)
ID: 4169
Location: Washington DC Metro Area
Clearance: Top Secret
Description

Netorian is seeking a Journeyman Data Brokering Engineer (Kafka).

The Journeyman Data Brokering Engineer (Kafka) is responsible for supporting the Sr. Data Brokering Engineer in architecting data brokering solutions that improve security value, service management, and scalability for our clients. The successful candidate will be a strong technologist with a practical yet creative mind, able to clearly communicate and demonstrate a solid understanding of the client's business and technical requirements, and to collaborate effectively with client information security and IT/IS teams as well as project and program managers to develop and deliver world-class results.

In this role, the Journeyman Data Brokering Engineer (Kafka) reports to the Sr. Data Brokering Engineer as part of a small team. Additional responsibilities include assisting with assessing, optimizing, and integrating the customer's sensor array so that data is delivered to the appropriate Kafka topic as accurately and timely as possible. This is a client-facing role, working closely with client IT security and IT/IS capabilities in addition to Netorian management and team members. This role does not require significant travel.

Employment Type:

  • Full-time, exempt.

Compensation:

  • Competitive, based on qualifications and experience.

Location:

  • National Capital Region and Ft. Meade area (remote work opportunity may be available after initial development is completed).

Travel:

  • 25% or less.

Security Clearance:

  • Must possess an active DoD Top Secret security clearance and be SCI eligible.
  • Must successfully pass a criminal background check and drug screening.

Education:

  • A bachelor’s degree in computer science, data science, or applicable field is required.
  • Four (4) years of demonstrable equivalent experience specific to data brokering, Apache Kafka, etc. may be considered as meeting the education requirement.

Certifications & Training:

  • CompTIA Security+ certification required.
  • CISSP certification is desired.
  • PMI PMP certification is desired.
  • ITIL Foundation certification is desired.

Experience & Skills:

  • The ability to rapidly understand the client’s business strategies and apply creative problem-solving skills to deliver high-impact solutions that meet business needs is a must.
  • Knowledge and experience working with either Apache Kafka, MicroFocus Event Broker, or Confluent Enterprise Community Edition is required.
  • Demonstrable experience supporting a major open-source project or initiatives involving security research or design/frameworks, information management, data and content modeling, and large-scale data analytics is strongly desired.
  • Four-plus (4+) years of demonstrable applied technology experience in data brokering and DCO operations, defining strategy around security monitoring, incident management, regulatory compliance, and process improvement, is required.
  • Two-plus (2+) years of demonstrable data brokering (Kafka) design, engineering, configuration, and maintenance experience is required.
  • Demonstrable experience developing Kafka Streams applications is desired.
  • Demonstrable experience related to data enrichment services and tools is required.
  • Demonstrable experience working with LogStash is required.
  • Sensor grid architecture knowledge and expertise.
  • Unix / Linux knowledge and hands-on experience.
  • Familiarity with Cyber Kill Chain methodologies.
  • Familiarity with the Windows Event Forwarding (WEF) framework.
  • Understanding of network firewalls, load balancers and complex system designs.
  • Demonstrable experience in various connector technologies (Kafka Connect, Elastic Beats, SmartConnectors, etc.) with strong Regex skills.
  • Command of Python, Perl, SQL, Regex, Shell Scripting, and Java is preferred.
  • Experience installing and maintaining open source log capture technologies such as Syslog-NG, Snare, LogStash, MSCOM, etc.
  • Demonstrable experience working with Elasticsearch is required. Must be knowledgeable in data analytics and establishing services that interact with Elastic, Hadoop, and Accumulo.
  • Demonstrable experience working with systems in a microservices framework.
  • Professional written and verbal communication skills.

Description of Work:

The Journeyman Data Brokering Engineer (Kafka) must be able to contribute effectively and leverage knowledge of Apache Kafka, data enrichment, DevSecOps, and IT/IS architectural design to ensure that operational and business goals are met. The Journeyman Data Brokering Engineer (Kafka) works closely with the Senior Data Brokering Engineer, team members, and external stakeholders to ensure that all coordinated efforts/pieces of the overall project stay within tolerances. The Journeyman Data Brokering Engineer (Kafka) must be capable of effectively analyzing the current and target states of the data collection architecture to ensure that effective and efficient data brokering, appropriate data enrichment, and data translation are achieved through productive implementation of the platform.

Duties and responsibilities include:

  • Aligns with Sr. Data Brokering Engineer priorities and contract requirements to assess and analyze operational performance and identify opportunities to improve and enhance existing and relevant services.
  • Supports the evaluation of existing data decimation strategies, technologies, and tools to facilitate the identification of critical elements, weaknesses, and opportunities for improvement.
  • Works independently and in coordination with others to complete tasks as assigned enabling the delivery of solutions that have a measurable positive impact on security value, service management, and client satisfaction.
  • Supports the creation of architecture diagrams, workflow models, and proposals/presentations delivered to key stakeholders with a wide range of business, security, and IT experience and capabilities.
  • Supports the development and creation of reference architectures and models and related/required documentation.
  • Supports the Sr. Data Brokering Engineer’s approach to driving scalability, sustainability, efficiency, and automation (including changes to staff/resources, processes, technologies, and tools).
  • Supports interactive client sessions as directed to assist with implementation, support, and usage of multiple product vendors and technologies.
  • Performs other duties as assigned.

- - - - -

Netorian offers a complete benefits package that includes medical/dental/vision, a 401(k), tuition assistance, paid holidays, and vacation. Netorian is an Equal Opportunity Employer (EOE): Minorities, Women, Veterans, and those with Disabilities.
