Developer III Specialized (AWS, Python/PySpark)
Location: Reston, VA
In this role you will have the flexibility to make each day your own while working alongside people who care, so that you can deliver on the following responsibilities:
- Work with product owners and other development team members to determine new features and user stories needed in large/complex development projects
- Create or update documentation in support of development efforts. Documents may include detailed specifications, implementation guides, architecture diagrams, or design documents.
- Participate in code reviews with peers and managers to ensure that each increment adheres to the original vision as described in the user story, and to all standard resource libraries and architecture patterns, as appropriate.
- Respond to trouble/support calls for applications in production in order to make quick repairs and keep the application running.
- Serve as a technical lead for an Agile team and actively participate in all team ceremonies, including planning, grooming, product demonstrations, and retrospectives. Identify and remove roadblocks.
- Mentor or provide technical guidance to less experienced staff; may use high-end development tools to assist or facilitate the development process.
- Lead projects/initiatives end to end with little supervision
- Leverage the Fannie Mae DevOps tool stack to build, inspect, deploy, test, and promote new or updated features. May serve as technical lead, architect, project lead, or principal developer in the course of a large or complex project.
- Expert proficiency in unit testing as well as coding in 1-2 languages (e.g., Python, Java).
- Expert proficiency in Object-Oriented Design (OOD) and analysis, and in applying analysis and design engineering functions.
- Expert proficiency in applying non-functional software qualities such as resiliency and maintainability. Expert proficiency in advanced behavior-driven testing techniques. Provide expertise for teams in all matters related to the build, deployment, and release process.
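As a hedged illustration of the unit-testing proficiency described above (the helper function and test names here are hypothetical, not part of this posting), a minimal Python example using the standard-library `unittest` module:

```python
import unittest


def parse_rate(text: str) -> float:
    """Parse a percentage string like '6.25%' into a fraction (hypothetical helper)."""
    return float(text.strip().rstrip("%")) / 100.0


class ParseRateTest(unittest.TestCase):
    def test_basic(self):
        # '6.25%' should parse to the fraction 0.0625
        self.assertAlmostEqual(parse_rate("6.25%"), 0.0625)

    def test_whitespace(self):
        # Surrounding whitespace should be ignored
        self.assertAlmostEqual(parse_rate(" 3% "), 0.03)
```

Tests like these run with `python -m unittest <file>` and would typically execute in a CI stage of the DevOps pipeline mentioned above.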
Minimum Required Experiences:
- 8+ years of relevant experience
Specialized Knowledge and Skills:
- 3+ years of experience with Spark on Amazon EMR/Hadoop, PySpark, and SQL and NoSQL databases
- 2+ years of experience building and deploying applications in AWS (S3, Lambda, Elastic Beanstalk, Hive, Glue, Redshift, RDS, CloudWatch, SNS, SQS, Kinesis, etc.)
- Experience processing large amounts of structured and unstructured data
- Experience developing web applications and services using Java, Spring Boot, and related frameworks