As a Backend Engineer at Synup, you will be responsible for working on features that directly impact users. You will:
• Work with your Technical Manager to ensure a feature implementation is scalable for the next million businesses that will use Synup.
• Work in cross-functional teams to build internal tools.
• Analyze & build tools to debug sub-systems.
• Build reporting mechanisms to monitor a sub-system's performance.
• Work with the Infrastructure Engineers to ensure smooth deployments.
• Get mentored by some of the finest folks in Bangalore.
The kind of people we are looking for:
• You have 4-10 years of work experience with Ruby on Rails.
• You have an eye for detail in your work.
• You are proactive and interested in solving problems for others.
• You can explore and pick up new tools & languages.
Our codebase is mostly in Ruby, but we have a few sub-systems in Python, Elixir, and Go.
Projects you could work on:
• Build GraphQL endpoints for every customer-facing component, or move existing APIs from REST to GraphQL.
• With just under a billion pages crawled for 300 million unique listings, we have built a state-of-the-art crawling system that powers our free scan product. If you like thinking about distributed systems, you might find a good home here!
• Design, build, and maintain integrations with 50 different partners. We are constantly expanding our network and onboard a new partner every month. While most of our integrations are straightforward, some have convoluted, head-spinning quirks. Help us simplify them.
• Integrate our payments partner into our sales pipeline, and feed that back into our lead-generation tool. We believe in automating as many manual, repetitive tasks as possible so that our team gets more productive and spends its time deriving insights from the vast data we gather.
• Plan and implement multi-region availability for our distributed job-queuing infrastructure! All of our systems can sustain losing machines, and making them even more resistant to failure is a big theme for us.
• We have a lot of batch jobs that run every day. How would you plan and build data pipelines, workflows, and reporting for these batch jobs?