A rebuild of the Nexpo backend in C#, in use for the ARKAD career fair as of 2023. This backend supports the Nexpo app and, hopefully, the new Nexpo website in the future.
Google Drive: Stores meeting protocols and the image source files for the figures used in the wiki. (Requires a TLTH Google account to access.)
Trello Board: Provides an overview of all issues, their progress, and who is responsible. (Requires access.)
The main methods for setting up the development environment are described below.

Script-based setup:
1. Install Docker.
2. `chmod +x runBackend.sh`
3. `sudo ./runBackend.sh`
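If the script fails immediately, it can help to first confirm that the tools it relies on are present. A minimal sketch, assuming only the tools named in the steps above:

```shell
# Check that the tools the setup relies on are installed.
for tool in docker dotnet; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```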
(Note: many of these solutions are also relevant to the standalone solution.)
- Try putting `sudo` in front of `./runBackend.sh`.
- Try putting `sudo` in front of the docker commands.
- Check which processes are listening on ports: `lsof -iTCP -sTCP:LISTEN -n -P`
- On macOS, port 5000 is often held by Control Center (AirPlay Receiver); free it with `pkill ControlCenter; nc -l 5000`, and kill the command after a few seconds.
- Check that Docker is installed: `docker -v`
- Check that dotnet is installed: `dotnet --version`
- The configured .NET version of the backend may not be the same as your installed version: compare the output of `dotnet --version` with `<TargetFramework>net7.0</TargetFramework>` in the project file.
- On Windows, check your WSL version: `wsl -v`
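One way to check for the version mismatch described above is to read the `<TargetFramework>` value straight out of the project file and compare it with `dotnet --version`. A sketch using an inline demo file (in the repo, point `grep` at the Nexpo `.csproj` instead):

```shell
# Demo: extract the TargetFramework from a csproj-style file.
# (demo.csproj is created inline here; use the real .csproj in the repo.)
printf '<Project>\n  <TargetFramework>net7.0</TargetFramework>\n</Project>\n' > demo.csproj
grep -o '<TargetFramework>[^<]*</TargetFramework>' demo.csproj
```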
If none of the above fixes the problem, try a standalone launch using the following command:
sudo ./runBackend.sh -s
</details>
sudo docker run -d --name nexpo_database -p 5432:5432 -e POSTGRES_USER=nexpo -e POSTGRES_PASSWORD=nexpo postgres:14
dotnet run --project Nexpo
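The `docker run` command above maps onto a compose service roughly as follows. This is a sketch, not necessarily identical to the repository's own `docker-compose` file:

```yaml
# Sketch of an equivalent compose service for the database container.
services:
  nexpo_database:
    image: postgres:14
    container_name: nexpo_database
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: nexpo
      POSTGRES_PASSWORD: nexpo
```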
Open the solution (the code) in Visual Studio via `Nexpo.sln`. Make sure the `docker-compose` project is selected as the startup project; if it is not, right-click `docker-compose` in the Solution Explorer and select “Set as Startup Project”.

The overall architecture can currently be split into six components with different responsibilities. They are as follows:
1. Controller:
Receives and responds to HTTP requests by calling the appropriate methods in the other components to produce the desired outcome. To control the format of the input and output, requests and database responses may be converted to DTOs before being forwarded to repositories or sent as a response.
2. Repository:
Responsible for translating requests into queries against the model and converting query results to relevant data objects before returning them.
3. Model:
C# representation of the database tables.
4. DTO:
Data Transfer Object; converts data to an object consisting of only the relevant fields. Can be used to prevent data leakage in HTTP responses, or as a simplified way of moving data between components.
5. Services:
Responsible for functionality outside the manipulation and gathering of data in the database. This entails token & file management, password validation and email services.
6. Helpers:
Consists of helper functions for the controllers. Currently it only converts claims to intelligible data.
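As background for the Helpers component: in ASP.NET Core, claims typically arrive inside a JWT, whose payload segment is just base64url-encoded JSON. A shell illustration with a fabricated payload (the claim names `sub` and `role` here are made up for the demo, not taken from the backend):

```shell
# Decode a fabricated JWT payload segment to show what "claims" look like.
payload='eyJzdWIiOiI0MiIsInJvbGUiOiJTdHVkZW50In0='
echo "$payload" | base64 -d   # → {"sub":"42","role":"Student"}
echo
```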
1. UploadToDB: Responsible for downloading and uploading files to the database.
2. Scripts:
There are also Bash and Bat scripts for running the tests. (More information is given by running the corresponding help command, e.g. `./runTest.sh -help`.)
These scripts have the ability to:
Start the database in a docker container
Run all tests
Run a subset of the tests, by specifying the class or controller to test.
3. Tests:
There are currently more than 200 tests, which can be demanding for the computer. `./runTest.sh -help` therefore describes ways to minimize this issue, namely by only running a fraction of the tests:
Options:
- `-start`: Reset the database before running the tests. This needs to be done when the database seeding has changed. Note that the first time this is done the tests will fail, so run the tests twice after `-start`.
- `-run`: Run all tests.
- `-run <class_name>`: Run the specified test class.
- `-run <controller_name>`: Run the test class for the specified controller.
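The option handling above could be dispatched along these lines. This is a hypothetical sketch, not the actual contents of `runTest.sh` (see `./runTest.sh -help` for the real behaviour), and `CompaniesController` is an assumed name:

```shell
# Hypothetical sketch of runTest.sh-style option dispatch.
run_tests() {
  case "$1" in
    -start) echo "resetting database, then running all tests" ;;
    -run)   echo "running tests: ${2:-all}" ;;
    *)      echo "usage: runTest.sh [-start] [-run [name]]" ;;
  esac
}

run_tests -run CompaniesController   # → running tests: CompaniesController
```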
Architecting-Modern-Web-Applications-with-ASP.NET-Core-and-Azure.pdf
NET-Microservices-Architecture-for-Containerized-NET-Applications.pdf
Current relations between the tables in the model:
User(Id, Email, PasswordHash, Role, FirstName, LastName, PhoneNr, FoodPreferences, hasProfilePicture, hasCV, ProfilePictureUrl, CompanyId)
Student(Id, Programme, ResumeEnUrl, ResumeSvUrl, LinkedIn, MasterTitle, Year, UserId)
Company(Id, Name, Description, DidYouKnow, LogoUrl, Website, HostName, HostEmail, HostPhone, DesiredDegrees, DesiredProgramme, Positions, Industries)
StudentSessionTimeslot(Id, Start, End, Location, StudentId, CompanyId)
StudentSessionApplication(Id, Motivation, Status, Booked, StudentId, CompanyId)
Event(Id, Name, Description, Date, Start, End, Location, Host, Language, Capacity, TicketCount)
Ticket(Id, Code, PhotoOk, isConsumed, EventId, UserId)
1. `sudo docker run -d --name nexpo_database -p 5432:5432 -e POSTGRES_USER=nexpo -e POSTGRES_PASSWORD=nexpo postgres:14`
2. `sudo dotnet test Nexpo.Tests/`
No matter the chosen setup method, an external database server must be started before running the tests for them to pass. This is because some tests perform black-box testing against the controllers. It may take a while for the container to populate the tables with the example data, so if most controller tests fail during the first run, try running them again.
NOTE: Running docker-compose in Visual Studio and then running the tests does, for some reason, not work. To run the tests:
1. `sudo docker run -d --name nexpo_database -p 5432:5432 -e POSTGRES_USER=nexpo -e POSTGRES_PASSWORD=nexpo postgres:14`
2. `sudo dotnet test Nexpo.Tests/`
Swagger allows you to see the specification of an API, including the endpoints, request parameters, response formats, and authentication methods. It provides a user interface for interacting with the API and testing its functionality. With Swagger, you can visualize and document your API in a standardized way, making it easier for developers to understand and use it.
To open the application in swagger:
If port 80 is already being used, stop the service occupying it, for example: `service nginx stop`
The classes stored in the Models directory are the skeleton of the database. We use something called Code First to define our database structure and relationships in code, and then generate the database modifications automatically.
If you need to change a Model (for example, adding a new field) or create a new one, it is as simple as editing or creating a class in the Models directory. If you are adding a new Model, you also need to add the supporting classes (the repositories) and the correct bindings that make it a new table. Take a look at how the others are connected; it should only take about two lines of code in ApplicationDbContext to add it as a table, and then a change in Startup to add the repository to the dependency injector.
After the code changes you have to generate the database update code, the so-called migration. A migration is a small script that transforms the database from the old state to the new one, to match what you added or removed in the code. For example, if you added a field to a Model, the migration will contain the SQL needed to alter the table to include the new column. Migrations can also be reverted, since they contain the information needed to undo their changes. To create a new migration for your changes, run the following command:
dotnet ef migrations add <a descriptive name for your changes>
After the migration is created you can take a look at the generated code to see if it is correct or if you need to make some changes (to the Model, don’t edit the generated migration). The migration is applied on development server startup but if you want to apply it manually you can run:
dotnet ef database update
To create a backup dump of the Postgres database, run the following shell command:
docker exec -t name_of_db_container pg_dumpall -c -U nexpo > name_of_dump.sql
NOTE: Store the dump file on a secure and private instance as it contains sensitive data about the content of the db.
To restore the db from a backup dump, run the following shell command:
cat name_of_dump.sql | docker exec -i name_of_db_container psql -U nexpo
To generate a script that applies generated migrations to the db, run the following shell command:
dotnet ef migrations script --idempotent > name_of_migration.sql
NOTE: Remove the first two lines of the generated script, as they are just output from the build process.
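Stripping those two lines can be done with `tail`. A sketch using an inline demo file (substitute the real `name_of_migration.sql`; the demo contents are fabricated):

```shell
# Simulate a generated script whose first two lines are build output,
# then keep everything from line 3 onward.
printf 'Build started...\nBuild succeeded.\nBEGIN TRANSACTION;\n' > demo_migration.sql
tail -n +3 demo_migration.sql > migration_clean.sql
cat migration_clean.sql   # → BEGIN TRANSACTION;
```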
To apply the migration to the production db, use the generated script and run the following shell command:
cat name_of_migration.sql | docker exec -i name_of_db_container psql -U nexpo
The backend is currently hosted on AWS EC2 as three docker containers. Additionally, profile pictures and CVs are stored in AWS S3.
There are several ways that the database can be updated.
The most straightforward way is doing it directly in the EC2 instance. Since the database is hosted in EC2, PostgreSQL commands can be used to update it. If you have found the database… good. If not, go to EC2 and enter:
docker exec -it <CONTAINER ID OF postgres:14> bash
psql -W nexpo -U nexpo
The backend currently running lives on the EC2 instance at /home/ec2-user/nexpo/nexpo-backend-nova and is available in GitHub. It is updated by pulling the updated GitHub repo. The keys are (obviously) not in the GitHub repo, but should be filled in automatically; make sure that this is the case. Also remember to migrate:
dotnet ef migrations script
dotnet ef database update
Then update the containers:
docker-compose -p nexpo build
docker-compose -p nexpo up -d
Finally, check that the containers are running: `docker ps`