Building a Full-Stack Dashboard for Drifter: A Step-by-Step Guide

Building a full-stack application draws on several distinct skills, from frontend UI development to data extraction through web scraping. This article walks through a two-part recruitment task for a full-stack developer position that requires both UI implementation and data-scraping skills.

Understanding the Requirements

The recruitment process includes two main tasks:

  1. Implementing a desktop UI according to a Figma design
  2. Creating a web scraping solution to extract product data

Task 1: Dashboard UI Implementation

The first task involves creating a responsive dashboard that exactly matches a provided Figma design. The implementation should focus on:

  • Matching design specifications precisely
  • Using Next.js and React.js for frontend development
  • Deploying to a service like Vercel or Render
  • Pushing code to a public GitHub repository

Breaking Down UI Components

A well-structured approach divides the UI into modular components; a sketch of how they fit together follows the list:

  • Chart Component: Displays various analytics charts
  • Dashboard Component: Main container for all dashboard elements
  • Data Table Component: Presents tabular data
  • Matrix Card Component: Shows key metrics
  • Sidebar Component: Navigation and filtering options
  • Top Cities Component: Geography-based analytics
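
As a rough illustration, the Dashboard container can simply compose the other components. The file paths, props, and class names below are assumptions made for the sketch, not details taken from the actual submission:

```tsx
// components/Dashboard.tsx -- illustrative composition only; each import is one of the components listed above
import Sidebar from "./Sidebar";
import MatrixCard from "./MatrixCard";
import Chart from "./Chart";
import DataTable from "./DataTable";
import TopCities from "./TopCities";

export default function Dashboard() {
  return (
    <div className="flex min-h-screen">
      {/* Navigation and filter options */}
      <Sidebar />
      <main className="flex-1 space-y-6 p-6">
        {/* Overview metrics such as total sales and quantity sold */}
        <section className="grid grid-cols-3 gap-4">
          <MatrixCard label="Total Sales" value="..." />
          <MatrixCard label="Total Quantity Sold" value="..." />
        </section>
        {/* Analytics charts */}
        <Chart />
        {/* SKU-level data in table form */}
        <DataTable />
        {/* Geography-based analytics */}
        <TopCities />
      </main>
    </div>
  );
}
```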

The dashboard features several key sections (a sample metric-card component is sketched after this list):

  • Overview section with sales and quantity metrics
  • Multiple chart visualizations
  • SKU-level data presented in table format
  • Filter options for data refinement
  • Side navigation panel
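
For instance, the metric cards in the overview section can be a single reusable component that receives its label and value as props. A minimal sketch, with prop names that are assumptions rather than the actual design tokens:

```tsx
// components/MatrixCard.tsx -- hypothetical metric card; props are illustrative
type MatrixCardProps = {
  label: string;        // e.g. "Total Sales"
  value: string;        // preformatted value, e.g. "125.5K"
  change?: number;      // percentage change versus the previous period
};

export default function MatrixCard({ label, value, change }: MatrixCardProps) {
  return (
    <div className="rounded-lg border p-4 shadow-sm">
      <p className="text-sm text-gray-500">{label}</p>
      <p className="text-2xl font-semibold">{value}</p>
      {change !== undefined && (
        <p className={change >= 0 ? "text-green-600" : "text-red-600"}>
          {change >= 0 ? "+" : ""}{change}% vs last month
        </p>
      )}
    </div>
  );
}
```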

Deployment Process

After completing the implementation, the next step is deploying the application:

  1. Push your code to a GitHub repository
  2. Connect your repository to a deployment platform like Vercel
  3. Configure deployment settings
  4. Deploy and obtain a public URL

Task 2: Web Scraping Implementation

The second task involves identifying public API endpoints and using them to extract product data from an e-commerce platform. The requirements, illustrated by the request sketch after this list, include:

  • Identifying public API endpoints that deliver product data
  • Creating a scraping script to extract specified data points
  • Handling location-based parameters (latitude/longitude)
  • Processing category and subcategory data
  • Saving results to a CSV file
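
A minimal sketch of what such a script might look like, written here in TypeScript with the built-in fetch. The endpoint, query parameters, and response shape are placeholders; the real ones have to be discovered from the browser's network tab:

```typescript
// scrape.ts -- minimal sketch; the endpoint, parameters, and response shape are hypothetical
import { writeFileSync } from "node:fs";

const BASE_URL = "https://www.example-store.com/api/v1/products"; // placeholder endpoint

async function fetchProducts(lat: number, lng: number, categoryId: string): Promise<any> {
  const url = `${BASE_URL}?lat=${lat}&lng=${lng}&category=${categoryId}`;
  const res = await fetch(url, {
    headers: { "User-Agent": "Mozilla/5.0", Accept: "application/json" },
  });
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}

async function main() {
  // Sample coordinates (Bengaluru) and a made-up category id
  const data = await fetchProducts(12.9716, 77.5946, "snacks");
  const products: any[] = data.products ?? [];
  const header = "product_id,name,mrp,selling_price";
  const rows = products.map((p) => [p.id, p.name, p.mrp, p.selling_price].join(","));
  writeFileSync("products.csv", [header, ...rows].join("\n"));
}

main().catch(console.error);
```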

Scraping Approach

Two main methods can be used for data extraction:

  1. Direct API Requests: Using the requests library to make API calls
  2. Browser Automation: Using Selenium to navigate and extract data when direct API calls are restricted (see the sketch after this list)
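
For the second option, a rough sketch using Selenium's Node binding (selenium-webdriver) is shown below; the CSS selector is a placeholder that depends on the actual page markup:

```typescript
// browse.ts -- sketch of the browser-automation fallback; the selector is hypothetical
import { Builder, By, until } from "selenium-webdriver";

async function scrapeWithBrowser(url: string): Promise<string[]> {
  const driver = await new Builder().forBrowser("chrome").build();
  try {
    await driver.get(url);
    // Wait until product cards have rendered, then collect their visible text
    await driver.wait(until.elementsLocated(By.css(".product-card")), 10_000);
    const cards = await driver.findElements(By.css(".product-card"));
    const texts: string[] = [];
    for (const card of cards) {
      texts.push(await card.getText());
    }
    return texts;
  } finally {
    await driver.quit();
  }
}

scrapeWithBrowser("https://www.example-store.com/category/snacks").then(console.log);
```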

The script needs to extract various product details (modeled by the record type sketched after this list), including:

  • Category and subcategory information
  • Product variants and IDs
  • Pricing data (MRP and selling price)
  • Inventory information
  • Product images and brand details
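
Giving the extracted row an explicit type keeps the scraping code and the CSV output consistent. The field names below are illustrative and should mirror whatever the real API response contains:

```typescript
// A hypothetical shape for one extracted row; adjust field names to the actual API response
interface ProductRow {
  category: string;
  subcategory: string;
  productId: string;
  variantId: string;
  name: string;
  brand: string;
  mrp: number;
  sellingPrice: number;
  inStock: boolean;
  imageUrl: string;
}

// Serialize one record as a CSV line, quoting fields that may contain commas
function toCsvLine(row: ProductRow): string {
  const quote = (v: string | number | boolean) => `"${String(v).replace(/"/g, '""')}"`;
  return [
    row.category, row.subcategory, row.productId, row.variantId, row.name,
    row.brand, row.mrp, row.sellingPrice, row.inStock, row.imageUrl,
  ].map(quote).join(",");
}
```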

Best Practices for Technical Assessments

When completing technical assessments for job applications, consider these best practices:

  1. Component-Based Architecture: Break down UI into logical, reusable components
  2. Proper File Naming: Use consistent naming conventions for better code organization
  3. Documentation: Comment your code and create a clear README
  4. Error Handling: Implement robust error handling in your scraping scripts (see the retry sketch after this list)
  5. Data Validation: Verify extracted data against the actual website
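
For the error-handling point in particular, wrapping each request in a small retry helper goes a long way when iterating over many category and location combinations. A minimal sketch (the retry count and delay are arbitrary):

```typescript
// Retry a flaky request a few times with a linearly growing delay before giving up
async function fetchWithRetry(url: string, retries = 3, delayMs = 1000): Promise<unknown> {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const res = await fetch(url);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return await res.json();
    } catch (err) {
      if (attempt === retries) throw err; // out of attempts, surface the error
      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
    }
  }
  throw new Error("unreachable");
}
```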

Following these guidelines will help you build a solution that demonstrates both technical proficiency and attention to detail, qualities that are essential for full-stack development roles.
