Data Engineer Resume
Los Angeles, CA
PROFESSIONAL SUMMARY:
- Over 8 years of professional IT experience with a strong emphasis on the development and testing of software applications
- Over 3 years of comprehensive experience in Big Data and the Hadoop ecosystem
- Hands-on experience with Big Data technology stacks including HDFS, Hive, HBase, Flume, Kafka, MapReduce, Oozie, Sqoop, Spark, and Tez
- Proficient in analyzing, designing, and developing software, translating business requirements into technical requirements, with a good understanding of product development
- Deep experience in data warehousing, delivering hybrid data engineering architectures that combine MPP databases and Hadoop
- Broad set of technology skills with hands-on experience across a wide variety of high-volume systems, up to terabyte-class EDW and analytical systems
- Hands-on experience setting up and building data visualizations in Tableau
- Adept at working with both technical and non-technical audiences
- Excellent communication and documentation skills
TECHNICAL SKILLS:
Big Data: Hive, Sqoop, Flume, Kafka, Oozie, Tableau, Spark
Tools: Mercury Tools, SQL Server Management Studio, Oracle SQL Developer, Quality Center, ALM, McKesson ClaimsXten & Clear Claims Connection, Amisys Advance, Facets
Operating System: MVS, UNIX, Windows XP/7
Scripting Languages: Python, Shell Scripting
Databases: SQL Server 2008 R2, Oracle 10g/11g
PROFESSIONAL EXPERIENCE:
Confidential, Los Angeles, CA
Data Engineer
Responsibilities:
- Built critical data sets in a hybrid data hub (Hadoop and Netezza EDW) for analytics and BI needs serving all of Confidential's business units
- Managed complex projects and initiatives that significantly impacted business results and required a high degree of cross-functional participation and coordination
- Instrumental in integrating Hadoop with the Enterprise Data Warehouse, saving 40% of storage space in Netezza
- Made extensive use of Sqoop and the Hive data warehouse environment for ETL and ELT (an ingestion sketch follows this list)
- Assisted the BI team with a MicroStrategy setup using Hive for data analytics
- Worked with cross-functional teams such as Data Science and Machine Learning on various data ingestion needs
- Created and automated data ingestion pipelines using Oozie and shell scripts
- Created Hive tables using multiple storage formats and compression techniques (a DDL sketch follows this list)
- Used Hive to analyze partitioned and bucketed data and compute various metrics for reporting
- Used Oozie to build streaming workflows that listened to various Kafka topics and transformed the data with Hive before loading it into HDFS (a streaming sketch follows this list)
- Involved in collecting and aggregating log data using Flume and staging it in HDFS for analysis
- Created multiple data pipelines that could handle 1TB of data
- Worked with core Python data structures, including lists and dictionaries
- Built data visualization dashboards in Tableau for real-time monitoring and data analysis
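
The Sqoop-to-Hive ingestion described above can be illustrated with a minimal, hedged sketch. The connection string, credentials path, and table names below are hypothetical, and a small Python wrapper stands in for the Oozie-scheduled shell scripts actually used:

```python
import subprocess
from datetime import date

# Hypothetical connection details and table names, for illustration only;
# in the pipelines described above, Oozie-scheduled shell scripts drove Sqoop.
LOAD_DT = date.today().isoformat()

sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:netezza://nz-host:5480/EDW",   # hypothetical MPP source
    "--username", "etl_user",
    "--password-file", "hdfs:///user/etl/.password",  # keep secrets off the command line
    "--table", "CLAIMS_FACT",
    "--hive-import",                                  # land directly in a Hive table
    "--hive-table", "analytics.claims_daily",
    "--hive-partition-key", "load_dt",
    "--hive-partition-value", LOAD_DT,
    "--num-mappers", "8",                             # parallelism for the import
]

# Fail fast so the scheduler can detect the error and retry the run.
subprocess.run(sqoop_cmd, check=True)
```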
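As a hedged illustration of the storage-format and compression choices mentioned above, the following PySpark sketch creates a partitioned, bucketed Hive table stored as ORC with Snappy compression and computes a simple reporting metric; the database, table, and column names are hypothetical:

```python
from pyspark.sql import SparkSession

# Assumes a Spark installation configured against the Hive metastore.
spark = (SparkSession.builder
         .appName("hive-ddl-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Partitioned and bucketed table, ORC storage with Snappy compression.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.claims_daily (
        claim_id   STRING,
        member_id  STRING,
        paid_amt   DOUBLE
    )
    PARTITIONED BY (load_dt STRING)
    CLUSTERED BY (member_id) INTO 32 BUCKETS
    STORED AS ORC
    TBLPROPERTIES ('orc.compress' = 'SNAPPY')
""")

# Typical reporting metric computed over the partitioned, bucketed data.
spark.sql("""
    SELECT load_dt, COUNT(*) AS claim_cnt, SUM(paid_amt) AS total_paid
    FROM analytics.claims_daily
    WHERE load_dt >= '2016-01-01'
    GROUP BY load_dt
""").show()
```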
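The Kafka-to-HDFS path can be sketched with Spark Structured Streaming (Spark is part of the stack above, though the workflows described used Oozie and Hive); the broker, topic, and paths are hypothetical, and the job assumes the spark-sql-kafka connector package is on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-hdfs-sketch").getOrCreate()

# Subscribe to a hypothetical topic; the broker address is illustrative.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "claims-events")
          .load())

# Light transform: Kafka keys and values arrive as bytes, cast to strings.
parsed = events.select(col("key").cast("string"),
                       col("value").cast("string"),
                       col("timestamp"))

# Land the stream in HDFS as Parquet, with checkpointing for recovery.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/landing/claims-events")
         .option("checkpointLocation", "hdfs:///checkpoints/claims-events")
         .start())
query.awaitTermination()
```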
Environment: Hadoop, HDFS, Informatica PowerCenter, Oracle, Hive, Sqoop, Flume, Kafka, Netezza Striper, Tableau, Agile, Jira, GitHub
Confidential, Quincy, MA
QA Analyst
Responsibilities:
- Created test data to enter medical authorizations in Acuity for various lines of business
- Validated creation of claims, members, providers, claims re-pricing, adjudication, provider contracts, networks, reimbursement arrangements and member data in Amisys Advance
- Verified back-end processing of the claims between Amisys Advance and McKesson Total Payment Integration system audit tool
- Validated the editing logic for physician coding in McKesson's Clear Claim Connection
- Validated the Recommendation, Savings, and Clear Claim Connection reports in McKesson's ClaimsXten and the paid feed sent to the ClaimsXten history database
Environment: Amisys 6.4.2, HP ALM, ClaimsXten 5.1, Clear Claim Connection, Acuity Advanced Care, Oracle 11g, Microsoft Office
Confidential, Durham, NC
QA Analyst
Responsibilities:
- Validated the Mobius reports (HIPAA audit, ED101E), mainframe files for mapping, and mainframe tables for modifiers and hold codes
- Keyed claims in FEPdirect (y-price and n-price), resolved deferrals, and adjusted claims
- Validated the claims in PMHS and the claim response from DC (dfload)
- Validated the claims posting cycle after the claim response was sent to PMHS
- Verified adjustments on both the claims and finance sides of PMHS, along with the Mobius error reports for all adjustments
Environment: PMHS 7.2, HP Quality Center 10.0, UltraEdit, Oracle 10g, Microsoft Office
Confidential, Minneapolis, MN
QA Analyst
Responsibilities:
- Primarily responsible for developing and maintaining test plans and test cases for UHG's highly visible Member EOB Migration system
- Also provided requirements analysis for the Coordination of Benefits, Primary Claim Calculation, and 835 Bundling projects
- Worked extensively with EDI transactions 270, 271, 276, 277, 834, 835, and 837
- Tested HIPAA claims processing through the clearinghouse
- Extensively worked on the Provider and Member Explanation of Benefits Migration
Environment: MS Visual Studio 2010, HP Quality Center 10.0, Oracle 9.6.1.1, UltraEdit, Microsoft Office