ETL Process Guidance for Data Engineering Interviews (10+ years)
₹750-1250 INR / hour
Closed
Posted 3 months ago
As someone preparing for data engineering interviews, I require expert guidance, especially on ETL processes. I need to focus on:
• This is an interview support role; you will be expected to assist during live interviews.
• Extraction techniques – The platforms of primary interest to me are Spark, AWS, Azure, GCP, and Hive. I want to understand effective methods for extracting data from these particular sources.
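As an illustration of the kind of extraction technique that comes up in these interviews, incremental (watermark-based) extraction is a common topic across Spark, Hive, and the cloud warehouses. A minimal sketch in plain Python; the table and column names below are purely hypothetical:

```python
def build_incremental_query(table: str, watermark_col: str, last_watermark: str) -> str:
    """Build a SQL query that pulls only rows changed since the last run.

    The same pattern applies whether the query is submitted through
    Spark SQL, Hive, or a cloud warehouse; names here are illustrative.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} > TIMESTAMP '{last_watermark}'"
    )

# Hypothetical usage: extract only orders updated since the last checkpoint,
# then persist the new max(updated_at) as the next checkpoint.
query = build_incremental_query("sales.orders", "updated_at", "2024-01-01 00:00:00")
```

The key design point interviewers usually probe is where the watermark is stored (a metadata table, a pipeline variable, or a checkpoint file) and how late-arriving data is handled.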
Ideal Skills and Experience:
- Expertise in ETL tools for data extraction
- Hands-on experience with Spark, AWS, Azure, GCP, Hive
- Profound knowledge in data engineering
- Experience in career coaching or mentoring will be a bonus
- SQL
- Python
This assistance will give me a competitive edge in my upcoming interviews by providing practical skills, knowledge, and confidence in addressing ETL-related questions. A mentor with over 10 years of experience in data engineering, particularly in ETL processes, and who keeps up with modern data extraction techniques would be ideal.
Hi,
I am an experienced data engineer currently working with Python, SQL, and ETL (Airflow) on AWS. You can reach out to me for assistance with learning data engineering.
Looking forward to discussing how I can help.
Thanks,
Avinash
Certified Solution Architect and Data Architect; for the last 4 years I have been working on Azure services. I work for a top US-based MNC and have worked on major health payer projects. I have 18 years of IT experience.
• Proactive and achievement-oriented professional with over 8 years of rich experience as an Azure Data Engineer.
• Experience building end-to-end ELT architectures.
• Keen analyst skilled in gathering and understanding client requirements, translating them into functional specifications, and provisioning suitable solutions.
• Experience creating Databricks notebooks and Azure Functions, and building ADF pipelines, Data/Delta Lake, Azure Synapse SQL, and Azure DevOps Git.
• Creating, deleting, and cancelling ADB jobs using the Databricks REST APIs.
• Optimization of long-running jobs, ADB notebooks, ADF pipelines, and queries.
• Implemented Azure DevOps CI/CD processes.
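The Databricks Jobs REST API calls mentioned above (create, delete, cancel) can be sketched roughly as follows. The workspace URL, token, and cluster ID are placeholders, and the requests are only constructed, not sent:

```python
import json
from urllib import request

DATABRICKS_HOST = "https://adb-1234567890.0.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapiXXXXXXXX"  # placeholder personal access token

def jobs_api_request(endpoint: str, payload: dict) -> request.Request:
    """Build a POST request against the Databricks Jobs API 2.1.

    Endpoints include jobs/create, jobs/delete, and jobs/runs/cancel;
    this helper only constructs the request object.
    """
    return request.Request(
        url=f"{DATABRICKS_HOST}/api/2.1/{endpoint}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Create a job that runs a notebook on a (hypothetical) existing cluster.
create_req = jobs_api_request("jobs/create", {
    "name": "nightly-etl",
    "tasks": [{
        "task_key": "extract",
        "existing_cluster_id": "0123-456789-abcdef",
        "notebook_task": {"notebook_path": "/ETL/extract"},
    }],
})

# Cancel a running job run by its run_id.
cancel_req = jobs_api_request("jobs/runs/cancel", {"run_id": 42})

# Sending one of these would be: request.urlopen(create_req)
```

In practice the same helper covers the delete case (`jobs/delete` with a `job_id` payload); optimization of long-running jobs then typically starts from the run metadata these APIs return.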