Key Responsibilities:
- Model Development & Integration
  - Design and implement ML/AI models for NLP, speech recognition, recommendation systems, and predictive analytics.
  - Fine-tune large language models (LLMs) and integrate them with external APIs (e.g., OpenAI, Anthropic, HuggingFace).
  - Build and maintain RAG pipelines backed by vector databases (a minimal sketch follows this list).
- AI System Engineering
  - Develop APIs and microservices (Python, FastAPI, Flask) to serve AI models.
  - Implement conversation memory, context handling, and multi-turn dialogue.
  - Optimize models for latency, cost-efficiency, and scalability.
- Data Engineering & Processing
  - Build pipelines for data ingestion, cleaning, labeling, and transformation.
  - Manage embeddings, knowledge bases, and structured/unstructured datasets.
  - Conduct feature engineering and dataset preparation for supervised and unsupervised learning.
- Deployment & Monitoring
  - Containerize AI services (Docker, Kubernetes) and deploy them to cloud environments (AWS, GCP, or Azure).
  - Monitor model performance, drift, and accuracy, with continuous retraining workflows.
  - Collaborate with DevOps on CI/CD and model lifecycle management (MLOps).
- Collaboration
  - Work closely with product managers, designers, and full-stack engineers to integrate AI features into products.
  - Participate in code reviews, architecture discussions, and sprint planning.
  - Stay current with AI/ML research and recommend new approaches.
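For a concrete sense of the day-to-day work, here is a minimal sketch of a RAG-style question-answering endpoint of the kind this role would build. It is purely illustrative: the embedding model, in-memory index, document snippets, and endpoint shape are assumptions for the example, not a description of our production stack.

```python
# Minimal illustrative RAG endpoint: embed the question, retrieve the most
# similar documents from a small in-memory index, and assemble a grounded prompt.
# All names (model, documents, route) are placeholders, not a prescribed stack.
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sentence_transformers import SentenceTransformer

DOCS = [
    "Our support line is open 9am-6pm IST on weekdays.",
    "Refunds are processed within 5-7 business days.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")       # any embedding model works here
doc_vectors = encoder.encode(DOCS, normalize_embeddings=True)

app = FastAPI()

class Question(BaseModel):
    text: str
    top_k: int = 2

@app.post("/ask")
def ask(q: Question):
    query_vec = encoder.encode([q.text], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vec                    # cosine similarity (vectors are normalized)
    top = np.argsort(scores)[::-1][: q.top_k]
    context = "\n".join(DOCS[i] for i in top)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {q.text}"
    # In production the prompt would be sent to an LLM API (OpenAI, Anthropic, etc.)
    # and the corpus would live in a managed vector database rather than in memory.
    return {"prompt": prompt, "retrieved": [DOCS[i] for i in top]}
```

Served locally with `uvicorn app:app --reload` (assuming the file is named app.py), this covers the retrieval and prompt-assembly half of the pipeline; the LLM call, caching, and conversation memory are the parts the role layers on top.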
Required Skills & Qualifications:
- Strong programming skills in Python and familiarity with ML/AI libraries (TensorFlow, PyTorch, scikit-learn, HuggingFace).
- Experience with LLMs, LangChain, RAG, embeddings, and conversational AI systems.
- Solid understanding of data structures, algorithms, and software engineering principles.
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) for deploying ML solutions.
- Knowledge of databases: PostgreSQL, Redis, and vector databases such as Pinecone, Weaviate, or FAISS (see the sketch after this list).
- Experience with APIs, FastAPI/Flask, Docker, and Kubernetes.
- Strong analytical skills and the ability to translate business problems into technical solutions.
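Comfort with the vector-search workflow behind these tools matters more than any single product. The sketch below shows the basic FAISS pattern (index embeddings, query for nearest neighbours); the dimensionality and random vectors are placeholders for illustration.

```python
# Illustrative FAISS workflow: index normalized embeddings, then retrieve the
# top-k nearest documents for a query vector. Data here is random placeholder data.
import faiss
import numpy as np

dim = 384                                    # e.g. the output size of a small embedding model
doc_embeddings = np.random.rand(1000, dim).astype("float32")
faiss.normalize_L2(doc_embeddings)           # normalize so inner product == cosine similarity

index = faiss.IndexFlatIP(dim)               # exact inner-product search
index.add(doc_embeddings)

query = np.random.rand(1, dim).astype("float32")
faiss.normalize_L2(query)
scores, ids = index.search(query, 5)         # scores and indices of the 5 nearest documents
print(ids[0], scores[0])
```

The same pattern carries over to managed stores such as Pinecone or Weaviate, where the index lives behind an API instead of in process.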
Nice to Have:
- Familiarity with speech-to-text (STT), text-to-speech (TTS), and telephony integrations (Twilio, Vonage, etc.).
- Exposure to MLOps tools (MLflow, Kubeflow, SageMaker); a brief MLflow tracking example follows this list.
- Contributions to open-source AI/ML projects.
- Experience in SaaS product development.
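On the MLOps side, familiarity with experiment tracking is the most immediately useful exposure. A minimal MLflow example follows; the experiment name, parameters, and metric values are placeholders, not real results.

```python
# Illustrative MLflow tracking: record parameters and metrics for a training run
# so results stay comparable across experiments. All values are placeholders.
import mlflow

mlflow.set_experiment("demo-intent-classifier")

with mlflow.start_run():
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_param("max_features", 20000)
    mlflow.log_metric("val_accuracy", 0.91)   # would come from a real evaluation
    mlflow.log_metric("val_f1", 0.88)
```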
What We Offer:
- Opportunity to build cutting-edge AI solutions for global clients and SMBs.
- Collaborative and innovation-driven culture.
- Growth opportunities across AI, Cloud, and Cybersecurity practices.
- Flexible work environment and access to partner ecosystems (AWS, GCP, Neysa.ai, etc.).