gitea/integrations
gitea-repositories-meta
migration-test
README.md
README_ZH.md
admin_user_test.go
api_admin_org_test.go
api_admin_test.go
api_branch_test.go
api_comment_test.go
api_fork_test.go
api_gpg_keys_test.go
api_helper_for_declarative_test.go
api_issue_label_test.go
api_issue_milestone_test.go
api_issue_reaction_test.go
api_issue_stopwatch_test.go
api_issue_subscription_test.go
api_issue_test.go
api_issue_tracked_time_test.go
api_keys_test.go
api_nodeinfo_test.go
api_notification_test.go
api_oauth2_apps_test.go
api_org_test.go
api_private_serv_test.go
api_pull_commits_test.go
api_pull_review_test.go
api_pull_test.go
api_releases_test.go
api_repo_edit_test.go
api_repo_file_create_test.go
api_repo_file_delete_test.go
api_repo_file_helpers.go
api_repo_file_update_test.go
api_repo_get_contents_list_test.go
api_repo_get_contents_test.go
api_repo_git_blobs_test.go
api_repo_git_commits_test.go
api_repo_git_hook_test.go
api_repo_git_notes_test.go
api_repo_git_ref_test.go
api_repo_git_tags_test.go
api_repo_git_trees_test.go
api_repo_languages_test.go
api_repo_lfs_locks_test.go
api_repo_lfs_migrate_test.go
api_repo_lfs_test.go
api_repo_raw_test.go
api_repo_tags_test.go
api_repo_teams_test.go
api_repo_test.go
api_repo_topic_test.go
api_settings_test.go
api_team_test.go
api_team_user_test.go
api_token_test.go
api_user_email_test.go
api_user_heatmap_test.go
api_user_org_perm_test.go
api_user_orgs_test.go
api_user_search_test.go
api_wiki_test.go
attachment_test.go
auth_ldap_test.go
benchmarks_test.go
branches_test.go
change_default_branch_test.go
cmd_keys_test.go
compare_test.go
cors_test.go
create_no_session_test.go
delete_user_test.go
download_test.go
editor_test.go
empty_repo_test.go
eventsource_test.go
explore_repos_test.go
git_clone_wiki_test.go
git_helper_for_declarative_test.go
git_smart_http_test.go
git_test.go
goget_test.go
gpg_git_test.go
html_helper.go
integration_test.go
issue_test.go
lfs_getobject_test.go
lfs_local_endpoint_test.go
links_test.go
migrate_test.go
mirror_pull_test.go
mirror_push_test.go
mssql.ini.tmpl
mysql.ini.tmpl
mysql8.ini.tmpl
nonascii_branches_test.go
oauth_test.go
org_count_test.go
org_test.go
pgsql.ini.tmpl
private-testing.key
privateactivity_test.go
pull_compare_test.go
pull_create_test.go
pull_merge_test.go
pull_review_test.go
pull_status_test.go
pull_update_test.go
release_test.go
rename_branch_test.go
repo_activity_test.go
repo_branch_test.go
repo_commits_search_test.go
repo_commits_test.go
repo_fork_test.go
repo_generate_test.go
repo_migrate_test.go
repo_search_test.go
repo_tag_test.go
repo_test.go
repo_watch_test.go
repofiles_delete_test.go
repofiles_update_test.go
setting_test.go
signin_test.go
signout_test.go
signup_test.go
sqlite.ini.tmpl
ssh_key_test.go
testlogger.go
timetracking_test.go
user_avatar_test.go
user_test.go
version_test.go
view_test.go
xss_test.go

README.md

Integration tests

Integration tests can be run with make commands for the appropriate backends, namely:

make test-mysql
make test-pgsql
make test-sqlite

Make sure to perform a clean build before running tests:

make clean build
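
For example, a clean rebuild followed by the sqlite suite can be chained in one line (this is simply the two commands above combined):

make clean build && make test-sqlite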

Run all tests via local drone

drone exec --local --build-event "pull_request"
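
The same command can be pointed at other build events supported by drone, for example the push pipeline, using the flag already shown above:

drone exec --local --build-event "push"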

Run sqlite integration tests

Start tests

make test-sqlite

Run mysql integration tests

Set up a mysql database inside docker

docker run -e "MYSQL_DATABASE=test" -e "MYSQL_ALLOW_EMPTY_PASSWORD=yes" -p 3306:3306 --rm --name mysql mysql:latest #(just ctrl-c to stop db and clean the container)
docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" --rm --name elasticsearch elasticsearch:7.6.0 #(in a second terminal, just ctrl-c to stop elasticsearch and clean the container)
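
Before starting the tests you can optionally confirm that both containers are reachable (a quick sanity check, not part of the Makefile):

docker ps --filter name=mysql --filter name=elasticsearch #(both containers should be listed)
curl http://localhost:9200 #(Elasticsearch should answer with its cluster info as JSON)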

Start tests based on the database container

TEST_MYSQL_HOST=localhost:3306 TEST_MYSQL_DBNAME=test TEST_MYSQL_USERNAME=root TEST_MYSQL_PASSWORD='' make test-mysql
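
A MySQL 8 variant also exists (see mysql8.ini.tmpl in the listing above); following the same pattern, a MySQL 8 run would look roughly like this, with the MYSQL8 naming taken from the substitution described under "Running individual tests" below:

docker run -e "MYSQL_DATABASE=test" -e "MYSQL_ALLOW_EMPTY_PASSWORD=yes" -p 3306:3306 --rm --name mysql8 mysql:8 #(just ctrl-c to stop db and clean the container)
TEST_MYSQL8_HOST=localhost:3306 TEST_MYSQL8_DBNAME=test TEST_MYSQL8_USERNAME=root TEST_MYSQL8_PASSWORD='' make test-mysql8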

Run pgsql integration tests

Set up a pgsql database inside docker

docker run -e "POSTGRES_DB=test" -p 5432:5432 --rm --name pgsql postgres:latest #(just ctrl-c to stop db and clean the container)

Start tests based on the database container

TEST_PGSQL_HOST=localhost:5432 TEST_PGSQL_DBNAME=test TEST_PGSQL_USERNAME=postgres TEST_PGSQL_PASSWORD=postgres make test-pgsql

Run mssql integration tests

Set up a mssql database inside docker

docker run -e "ACCEPT_EULA=Y" -e "MSSQL_PID=Standard" -e "SA_PASSWORD=MwantsaSecurePassword1" -p 1433:1433 --rm --name mssql microsoft/mssql-server-linux:latest #(just ctrl-c to stop db and clean the container)

Start tests based on the database container

TEST_MSSQL_HOST=localhost:1433 TEST_MSSQL_DBNAME=gitea_test TEST_MSSQL_USERNAME=sa TEST_MSSQL_PASSWORD=MwantsaSecurePassword1 make test-mssql

Running individual tests

Example command to run the GPG tests:

For sqlite:

make test-sqlite#GPG

For other databases (replace MSSQL with MYSQL, MYSQL8, or PGSQL):

TEST_MSSQL_HOST=localhost:1433 TEST_MSSQL_DBNAME=test TEST_MSSQL_USERNAME=sa TEST_MSSQL_PASSWORD=MwantsaSecurePassword1 make test-mssql#GPG
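
The same substitution applies to the make target itself; for example, the pgsql equivalent of the command above would be:

TEST_PGSQL_HOST=localhost:5432 TEST_PGSQL_DBNAME=test TEST_PGSQL_USERNAME=postgres TEST_PGSQL_PASSWORD=postgres make test-pgsql#GPG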

Setting timeouts for declaring long tests and long flushes

We appreciate that some testing machines may not be very powerful and the default timeouts for declaring a slow test or a slow clean-up flush may not be appropriate.

You can either:

  • Within the test ini file set the following section:
[integration-tests]
SLOW_TEST = 10s ; 10s is the default value
SLOW_FLUSH = 5s ; 5s is the default value
  • Set the following environment variables:
GITEA_SLOW_TEST_TIME="10s" GITEA_SLOW_FLUSH_TIME="5s" make test-sqlite
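
For example, on a slow machine you might raise both limits for a sqlite run:

GITEA_SLOW_TEST_TIME="30s" GITEA_SLOW_FLUSH_TIME="15s" make test-sqlite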