FlowScrape is a SaaS platform for workflow automation with integrated web scraping capabilities. Built on Next.js, FlowScrape lets users automate complex data-extraction workflows, securely store credentials, manage billing, and monitor performance, all in one intuitive interface.
- Workflow Automation: Build and execute multi-step workflows. Tasks run in distinct phases with assigned credit costs, giving granular control over scraping executions.
- Advanced Web Scraping Tools: A suite of scraping tools for designing workflows tailored to different data needs, with support for automated actions and scheduled executions.
- Credential Storage: Securely store API keys, tokens, and other sensitive values with encrypted storage.
- Billing System with Stripe: Manage billing and subscriptions through the Stripe integration, with transparent usage tracking, subscription management, and billing history.
- Intuitive UI and Analytics: A clean, modern UI built with shadcn/ui, featuring real-time charts and reports for monitoring scraping performance and usage.
- Secure Server-Side Handling: FlowScrape uses Next.js server actions for backend operations, keeping sensitive processing on the server.
- AI-Powered Web Scraping (Beta): A beta AI-driven feature that intelligently navigates and scrapes data from complex websites.
- Integrated Chatbot Assistant: An AI-powered chatbot that provides real-time help with building and troubleshooting workflows.
- Ethical Scraping Compliance: Automatic robots.txt checking ensures your scraping activity complies with each website's policies and terms of service.
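A robots.txt check of the kind described above can be sketched in TypeScript. This is an illustrative rule matcher under the Robots Exclusion Protocol's longest-match convention, not FlowScrape's actual implementation; the function name is an assumption.

```typescript
// Minimal robots.txt allow/deny check (illustrative, not FlowScrape's real code).
// Parses User-agent groups and their Allow/Disallow rules; the longest matching
// path prefix wins, and paths with no matching rule are allowed by default.
function isPathAllowed(robotsTxt: string, userAgent: string, path: string): boolean {
  const rules: { allow: boolean; prefix: string }[] = [];
  let applies = false; // true while inside a group that matches our user-agent

  for (const raw of robotsTxt.split(/\r?\n/)) {
    const line = raw.split("#")[0].trim(); // strip comments
    const m = line.match(/^(user-agent|allow|disallow)\s*:\s*(.*)$/i);
    if (!m) continue;
    const [, field, value] = m;
    if (field.toLowerCase() === "user-agent") {
      applies = value === "*" || userAgent.toLowerCase().includes(value.toLowerCase());
    } else if (applies && value) {
      rules.push({ allow: field.toLowerCase() === "allow", prefix: value });
    }
  }

  let best: { allow: boolean; prefix: string } | null = null;
  for (const r of rules) {
    if (path.startsWith(r.prefix) && (!best || r.prefix.length > best.prefix.length)) {
      best = r;
    }
  }
  return best ? best.allow : true;
}
```

A scraper would fetch `https://example.com/robots.txt` and run this check against each target path before queuing the request.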
- Sign Up: Create an account on FlowScrape and choose a subscription plan that suits your needs. First-time users receive 100 free credits.
- Add Credentials: Securely store your API keys, tokens, and other credentials for seamless workflow execution.
- Build Your Workflow: Utilize FlowScrape's tools to design your workflow phases, and scrape the web with controlled execution.
- Monitor and Analyze: Track the performance of your workflows through real-time analytics, and manage your billing and usage directly on the dashboard.
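The workflow-building step above can be illustrated with a rough model of phased execution. The types, task names, and credit values here are hypothetical, not FlowScrape's real schema:

```typescript
// Hypothetical shape of a FlowScrape workflow: ordered phases, each with a credit cost.
interface Phase {
  name: string;
  task: "LAUNCH_BROWSER" | "NAVIGATE" | "EXTRACT_TEXT" | "DELIVER";
  credits: number;
}

interface Workflow {
  name: string;
  phases: Phase[];
}

// Total credits one execution of the workflow would consume.
function totalCredits(workflow: Workflow): number {
  return workflow.phases.reduce((sum, p) => sum + p.credits, 0);
}

const demo: Workflow = {
  name: "price-monitor",
  phases: [
    { name: "open", task: "LAUNCH_BROWSER", credits: 5 },
    { name: "visit", task: "NAVIGATE", credits: 2 },
    { name: "scrape", task: "EXTRACT_TEXT", credits: 3 },
  ],
};
// totalCredits(demo) === 10
```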
- Frontend: Next.js, Tailwind CSS, and shadcn/ui for UI components
- Backend: Secure server-side processing with Next.js server actions
- Billing: Stripe integration for payment processing
- Security: Encrypted credential storage to protect sensitive data
- Analytics: Real-time data visualization and reporting
- AI Integration: OpenAI-powered chatbot and scraping assistance
- Ethical Compliance: Automated robots.txt parsing and compliance checking
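The server-action pattern in the stack above can be sketched as follows. The `"use server"` directive keeps the function server-only in Next.js, so secrets never reach the client; `runWorkflow` and its return shape are invented for this example.

```typescript
// Minimal sketch of a Next.js server action (illustrative names, not FlowScrape's API).
"use server";

export async function runWorkflow(workflowId: string, userId: string) {
  // Validate input on the server before touching the database.
  if (!workflowId || !userId) {
    throw new Error("workflowId and userId are required");
  }
  // ...look up the workflow, check the user's credit balance, enqueue the execution...
  return { workflowId, status: "QUEUED" as const };
}
```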
FlowScrape's credit-based system allows you to manage workflow executions efficiently. The Stripe integration provides transparent billing, letting you track usage and manage subscriptions.
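A minimal sketch of how credit deduction might work under such a system (the names and logic are illustrative, not FlowScrape's billing code):

```typescript
// Illustrative credit ledger: deduct an execution's cost from a user's balance,
// rejecting runs the balance cannot cover.
interface CreditAccount {
  userId: string;
  balance: number;
}

function chargeExecution(account: CreditAccount, cost: number): CreditAccount {
  if (cost < 0) throw new Error("cost must be non-negative");
  if (account.balance < cost) {
    throw new Error(`Insufficient credits: have ${account.balance}, need ${cost}`);
  }
  // Return a new account object rather than mutating the input.
  return { ...account, balance: account.balance - cost };
}
```

In a real deployment this check would run inside a database transaction so concurrent executions cannot overspend the same balance.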
- Clone the repository:

  ```bash
  git clone https://github.com/Amishmathur1/ScrapifySAAS
  cd ScrapifySAAS
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Set up environment variables. Create a `.env` file in the root directory and add the following:

  ```env
  OPENAI_API_KEY=your_openai_api_key
  DATABASE_URL=postgresql://username:password@localhost:5432/yourdb
  NEXTAUTH_SECRET=your_nextauth_secret
  NEXTAUTH_URL=http://localhost:3000
  ```

- Run database migrations:

  ```bash
  npx prisma migrate dev
  ```

- Start the development server:

  ```bash
  npm run dev
  ```

  The app will be available at `http://localhost:3000`.
- Use NextAuth.js to sign up or log in to your account.
- Drag and drop nodes to define scraping tasks.
- Use AI suggestions for selector optimization.
- Get real-time assistance from the integrated chatbot.
- Securely store website login credentials if required.
- Use the scheduling feature to automate scraping tasks.
- The system automatically checks robots.txt before execution to ensure compliance.
- Use the integrated chatbot to get help with:
- Workflow optimization
- Selector suggestions
- Troubleshooting
- Best practices for ethical scraping
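Credential storage of the kind described above is commonly built on authenticated encryption. The sketch below uses Node's built-in AES-256-GCM; the `CREDENTIAL_SECRET` variable and function names are assumptions for illustration, not FlowScrape's actual scheme.

```typescript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "crypto";

// Illustrative credential encryption with AES-256-GCM. A key is derived from a
// server-side secret; the random IV and auth tag are stored alongside the
// ciphertext so each credential can be decrypted and tamper-checked later.
const key = scryptSync(process.env.CREDENTIAL_SECRET ?? "dev-only-secret", "salt", 32);

function encryptCredential(plaintext: string): string {
  const iv = randomBytes(12); // fresh IV per credential
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const encrypted = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Pack iv + auth tag + ciphertext into one storable string.
  return [iv, cipher.getAuthTag(), encrypted].map((b) => b.toString("hex")).join(".");
}

function decryptCredential(stored: string): string {
  const [iv, tag, data] = stored.split(".").map((h) => Buffer.from(h, "hex"));
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // decryption fails if the ciphertext was tampered with
  return Buffer.concat([decipher.update(data), decipher.final()]).toString("utf8");
}
```

Only the packed string would be written to the database; the derivation secret stays in server-side environment configuration.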
- Start development server: `npm run dev`
- Build for production: `npm run build`
- Run production server: `npm start`
- Lint code: `npm run lint`
- Format code: `npm run format`
- Add support for multi-step scraping workflows.
- Integrate more export formats (e.g., Google Sheets, Excel).
- Enhance AI capabilities for broader use cases.


