updated readme and github action
jakopako committed Jan 11, 2022
1 parent 0461977 commit 24edd47
Showing 3 changed files with 6 additions and 2 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/main.yml
@@ -31,5 +31,5 @@ jobs:
env:
API_USER: ${{ secrets.API_USER }}
API_PASSWORD: ${{ secrets.API_PASSWORD }}
- CRONCERT_API: ${{ secrets.CRONCERT_API }}
+ EVENT_API: ${{ secrets.EVENT_API }}

4 changes: 4 additions & 0 deletions README.md
@@ -27,6 +27,10 @@ Currently an event has the following fields:

Have a look at the configuration file `config.yml` for details about how to configure the crawler for a specific website.

## Run the crawler

The crawler can be executed with `go run main.go` to crawl all configured locations and print the results. To run a single crawler, add the flag `-single`. To write the events to the event API, set the environment variables `API_USER`, `API_PASSWORD` and `EVENT_API` and add the flag `-store` to the go command.
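Put together, an invocation with storage enabled might look like the following; the credential values and endpoint URL are placeholders, not real configuration:

```
export API_USER=user
export API_PASSWORD=secret
export EVENT_API=https://example.com/api/events
go run main.go -store
```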

## Regular execution through GitHub Actions

The crawler is regularly executed through GitHub Actions, and the crawled data is consequently written to the event API described below.
2 changes: 1 addition & 1 deletion main.go
@@ -293,7 +293,7 @@ func extractStringRegex(rc *RegexConfig, s string) (string, error) {
}

func writeEventsToAPI(c Crawler) {
- apiUrl := os.Getenv("CRONCERT_API")
+ apiUrl := os.Getenv("EVENT_API")
client := &http.Client{
Timeout: time.Second * 10,
}
