Job Description
• Work with business partners to enable world-class compliance solutions at PayPal.
• Design and develop sophisticated big data solutions such as large-scale graph processing and interactive data analysis.
• Work with top engineering talent to solve technical challenges in the enterprise data domain, including high throughput, low latency, and high availability, using cutting-edge big data technologies such as Spark, Flink, and HBase.
• Write well-commented, maintainable code that can be reused across a sub-system or feature and may persist for the lifetime of a software version.
• Ensure code is thoroughly tested, supported by unit tests, and ships with very few bugs.
• Lead feature, component, and sub-system design reviews and code reviews; be recognized as the go-to developer for a product or major sub-system and as a leader in their specialized field.
• Participate in architecture discussions; propose and discuss solutions to system and product changes directly related to their area of focus.
• Manage multiple applications, providing necessary support and maintenance.
• Be comfortable working in an agile environment with cross-functional teams; have an appetite to learn and the flexibility to pick up new technologies.
Requirements
• Bachelor's degree or above in EE/CS or a related major.
• 5+ years of experience as a data engineer or in a similar role; experience on large enterprise data warehouse teams preferred.
• Can be relied on to deliver data solutions on time and to requirements, without data quality issues.
• Leads the team and contributes effectively to the success of those they interact with regularly.
• Can triage and resolve data and process issues.
• Able to evangelize best practices through prototyping or other means.
• Helps resolve site database issues and SLA/ATB impacts.
• Good understanding of data modelling concepts; experience modelling data and metadata to support relational and non-relational database implementations; experience building logical and physical data models.
• Familiar with various big data technologies and open-source data processing frameworks.
• Evaluates and implements data solutions using various big data technologies.
• Good understanding of data processing, data structure optimization, and design for scalability.
• Optimizes the processing of extremely large datasets to meet SLAs.
• Good understanding of REST-oriented APIs, distributed systems, data streaming, and NoSQL solutions for creating and managing data integration pipelines for batch and real-time data needs.
• Excellent communication and teamwork skills for collaborating with cross-functional teams to deliver data solutions successfully.
• Understanding of version control systems, particularly Git.
• Strong analytical and problem-solving skills.
• Good understanding of database principles and SQL beyond just data access.
• Expert in Unix/Linux shell scripting and Java (or Scala).
• Knowledge of OLAP systems is a big plus.
• Knowledge of Calcite is a big plus.
• Intermediate-level knowledge of the following technologies, with expertise in a few of them:
o Spark/Flink with Java or Scala
o HBase
o Hive
o Elasticsearch
o Druid
o Scripting (Shell, Python, Java)
o Database fundamentals (Oracle or Teradata)
• Intermediate-level knowledge of the following business domains is a plus:
o Marketing
o Customer Service
o Payments
o Merchant servicing
o E-commerce
• Excellent oral and written communication skills in English.