Senior Data Engineer

  • Tokyo
  • Remote OK
  • Full-time
  • November 14, 2024
Conditions
Apply from Anywhere 👍
Relocation to Japan 👍
(Overseas visa sponsorship supported)
Requirements
Language Requirements
Japanese: Conversational
English: Business Level
Minimum Experience
Senior or above

<About AlpacaTech>

AlpacaTech offers a wide range of technological solutions for asset management, trading, and market making that leverage state-of-the-art AI to give our clients a competitive edge in the industry.

**AlpacaTech** (www.alpaca-tech.ai), formerly the AI Solutions division of AlpacaJapan, is Asia’s leading AI FinTech, based in Tokyo and specializing in building AI technologies for financial markets. We provide short-term and long-term forecasting signals for a wide range of assets, as well as asset allocation recommendations for investment managers.

Our groundbreaking technology has already attracted the attention of major banking and financial institutions in Japan and around the world, leading to long-lasting partnerships.

 

<Department and Responsibilities>

Department: AI Division

As a Data Engineer, you will build and maintain our services and systems, collaborate with data scientists and system engineers to improve their productivity and create new products together, and build and maintain historical and live data pipelines with a focus on quality and performance.

Your responsibilities will include:

- Designing scalable, maintainable, and robust live production systems using data engineering expertise and domain knowledge of financial data
- End-to-end ownership of our data pipelines, from requirements gathering to design, deployment, monitoring, maintenance, and shutdown
- Contributing to the DevOps of AlpacaTech products including improvement of existing products
- Building low-latency production environments for handling high-frequency streaming data
- Collaborating with the data scientists to improve their productivity: collect system and data requirements, create roadmaps and build the data platform used for research
- Building a scalable data lake and data pipelines with a focus on ease of use and clarity
- Prospecting, negotiating, proposing, and onboarding new data sources and providers, and ensuring compliance with data-usage licenses
- Driving innovation by evaluating new technologies that add value to our products
- Collaborating with the team to iterate quickly on products and share knowledge

Preference will be given to candidates who have an interest and ability to handle cloud infrastructure responsibilities.

※Scope of change: Business determined by the company

<Attraction and Rewards>

You will be working in the AI department, which is our core competency.

The AI department is a group of talented data scientists, 90% of whom are foreign nationals, and you will be able to benefit from their expertise and English language skills.

You will be able to propose and implement our AI solutions, which are well proven at major financial institutions.

The organization is very flat, so you will be able to communicate easily with executives and managers.

We place a high value on career development and ease of work, and we operate to a global standard.

 

<Qualifications>

**Minimum Qualifications:**

- Degree in Computer Science or a related field, with 5 years of relevant experience
- Strong structural knowledge of computer science, from low-level to high-level layers
- Fluent with the Python data stack: pandas, NumPy
- Familiar with:
    - handling data in many forms
    - parsing and transforming data safely and with high performance
    - Big Data and out-of-core processing
    - scaling data pipelines and programmatic ETL
    - containerization, container orchestration tools, DevOps, and deployment
    - general engineering workflow (requirements, design, implementation, testing, monitoring)
    - good software engineering practices (coding, review, testing, git, safe deployment, legacy code changes, replicating environments)
    - cloud services such as AWS, GCP, Azure, etc
    - workflow engines such as ArgoWorkflow, Airflow, Prefect, etc
- Understand:
    - the challenges associated with data flow (import, processing, export)
    - the difficulties in pre-processing, post-processing, and synchronization of data and jobs
    - what a DAG is, and why it is helpful for job design and recovery from failure
    - principles of monitoring/detecting issues in online production systems
    - the pros and cons of a database vs a data lake, and when it is appropriate to use different cloud-based storage solutions
    - timestamp transformations, timezone awareness
- Basic Japanese language skills (reading, writing)

 

**Preferred Qualifications:**

- Business Japanese language proficiency, capable of engaging in conversations and delivering presentations
- Knowledge of several programming languages: Golang, Rust, Java, etc
- Experience with:
    - handling financial data: Forex, Stocks, ETFs
    - handling streaming data and protocols
    - building data-centric products: Big Data, Search or Data Analytics platforms
    - data investigation and analysis
    - data scouting and communication with data providers
    - application/resource monitoring with DataDog
    - ML packages/frameworks (scikit-learn, PyTorch, TensorFlow, Chainer, etc)
    - project management, demonstrating hands-on experience in planning, organizing, and executing projects effectively
    - team management, including the ability to lead, motivate, and coordinate team members to achieve project goals
    - Kubernetes, both application deployment and cluster management
    - AWS services (such as RDS, S3, Athena, EC2, EKS, Cloudwatch, Lambda, VPCs, DynamoDB, Redshift, etc)
    - creating AWS resources with Terraform
- Understanding of data management, data quality monitoring and metadata management
- A security-minded approach when creating AWS cloud resources; well-versed in IAM, Security Groups, secrets management, etc

**Soft skills:**

- Excellent team player with high integrity and a desire to help other team members succeed
- Strong documentation, organization, and English skills - we love to write, read, and collaborate asynchronously
- Loves to learn and is open-minded
- Can see both the small picture (detail-oriented) and the big picture (how their work fits into and affects the company)
- Open to and wants feedback, has a desire to always improve themselves and their results
- Prepared to take on multiple roles and responsibilities as needed, as we are a small company that often requires flexibility and adaptability

 

<Corporate Culture>

Team Diversity:

We are a collaborative team of engineers, data scientists, quants, architects, designers, bankers and project managers from diverse backgrounds.

Community Engagement:

We benefit from the community and are happy to give back. We cherish connections with our external partners, investors, and players in the start-up industry. We have recently open-sourced our high-performance time series database, MarketStore (https://github.com/Alpacahq/marketstore), written in Go.

Language:

Our main language is English, while Japanese is also used depending on the business segment and job role. Internal communication tools include Slack, Google, and Zoom.

Work-life Balance:

We believe that, beyond an employee’s working life, parenting, family, and friends are all very important aspects of a person’s life, and we honor that. We allocate and encourage remote work time to allow for a fulfilling work-life balance.

 

<Others>

- Contract type: Full-time position
- Location: Tokyo, Japan, close to Otemachi metro station, with a hybrid remote-working arrangement

※Scope of change: Locations determined by the company (including locations for telework)

- Compensation and Allowances: salary, remote work allowance, and a commuting allowance paid for days worked in the office
- Benefits & Welfare: Employee pension insurance, health insurance, employment insurance, membership to welfare service
- Holidays: Saturdays, Sundays, Japanese national holidays, and other days specified by the company, such as December 31st ~ January 3rd
- Paid Leave: 10 or 20 days for the first year and 20 days per year thereafter
- Working hours: 9:00 ~ 18:00; flexible upon agreement with your manager
- Language Requirement: English
- Visa Sponsorship: Available

AlpacaTech is a technology company that develops investment management and trading solutions.

We combine cutting-edge technology and financial know-how to deliver sophisticated products that meet our clients' high standards.


