Web scraper for scraping data from the web pages of the city portal with saving information in the csv-format file.

Web Scraper in Ruby (Capstone Project)

I created this project to complete the Ruby section of the main Microverse technical program. To exercise the scraper, the city portal Mykharkov.info was used to collect data on the sights of the city; for each sight, the name, address, and a short description were scraped.

In this project

  • Used the Nokogiri gem for HTML parsing
  • Used the Pry debugger to inspect the values scraped from the page
  • Saved the scraper results to a CSV-format file
  • Set up a code linter in the repository
  • Followed GitHub flow
  • Used RSpec to write test cases for the public methods of the classes
  • Applied the basic principles of OOP
  • Kept an organized project structure
  • Used common Ruby patterns

[Screenshot: CSV output file]

Built With

  • Ruby
  • Rubygems
  • Nokogiri
  • Open-Uri
  • Pry
  • Rspec
  • VSCode

Getting Started

Clone this repository. In a terminal, navigate to the cloned directory and install the gems required to run the program:

  • nokogiri
  • pry
  • csv
  • rspec

💻 Use

To run the program, run 'ruby bin/main.rb' and follow the instructions.

[Screenshot: CSV output file]

📝 Run tests

RSpec is used as the testing tool.

In the root folder, run 'rspec' or 'rspec --format documentation' to run the tests.

[Screenshot: test run]

👤 Oksana Petrova

🤝 Contributing

Contributions, issues, and feature requests are welcome!

Feel free to check the issues page.

Show your support

Give a ⭐️ if you like this project!

📝 License

This project is MIT licensed.
