RESTful Setup Instructions
We’ll be using a very similar structure to what we did in the DB project. We need to interact with PostgreSQL, but we also need a server to run in the background listening to requests and doing any logic we need.
These first few instructions will be exactly the same as the DB project, but they start to diverge slightly for this project, so follow along carefully. We will assume that you have completed the DB project prior to this.
NOTE: There will be new groups and new partners for the REST project.
- Log into https://git.gccis.rit.edu using your RIT username and password. Go to the group you were assigned. In your group, we have created a project for you called `rest-abc123`, where `abc123` is your username. Note that you have access to one other student's repository. You are going to be pushing code to your own repository and making comments on other students' repositories. (See our FAQ on working with other students' code.)
- Clone your project repository locally using your favorite Git client. (See our Git page for some helpful resources.)
- You will need Python 3.9 installed on your system. Be sure to add `python` to your PATH during installation. SWEN lab machines: if you are in the lab, this is pre-installed and available on the command line.
- Install PostgreSQL 14. SWEN lab machines: PostgreSQL is already installed on the SE lab machines.
- Using the PostgreSQL admin console (pgAdmin), create a user called `swen610` with a password of your choosing. This password will be stored in a file, so you don't need to memorize it - you can mash the keyboard. Just keep that string handy, because we're about to put it in a file in a moment. Note: be sure to check the box for "Can login?" on the Privileges tab. SWEN lab machines: this has been done for you. The password is `salutecaptionearthyfight`.
- Still in pgAdmin, create a database, also called `swen610`, and make the user `swen610` its owner.
- Now let's create our project structure in our local repository. We're going to explain every file, but let's start by turning some of our directories into Python packages by creating a bunch of directories and empty `__init__.py` files. Please use these exact folder names. Your file structure should be this:

  ```
  rest-abc123/        // your username instead of abc123
      __init__.py
      config/         // don't need an __init__.py here
      src/
          __init__.py
          api/
              __init__.py
          db/
              __init__.py
      tests/
          __init__.py
          api/
              __init__.py
          db/
              __init__.py
  ```
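If you'd rather not create each folder and file by hand, a short throwaway script can scaffold the structure above (a sketch only; run it from the root of your cloned repository):

```python
import os

# Every directory that must be a Python package needs an empty __init__.py;
# config/ is the one directory that does not get one.
packages = ['.', 'src', 'src/api', 'src/db', 'tests', 'tests/api', 'tests/db']

os.makedirs('config', exist_ok=True)
for pkg in packages:
    os.makedirs(pkg, exist_ok=True)
    # 'a' mode creates the file if missing without clobbering an existing one
    open(os.path.join(pkg, '__init__.py'), 'a').close()
```

Either way, double-check the result against the tree above before moving on.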
- We need to tell Git not to version control a handful of files. Create a `.gitignore` file (note the dot at the beginning) at the root of the repository with the following contents:

  ```
  *.pyc
  **/db.yml
  ```
- Create a file `config/db.yml` with the following content. Change the `replaceme` password to the password you made a moment ago in the pgAdmin console. This is the configuration file we'll use to connect to our PostgreSQL database. Notice how our `.gitignore` file tells Git not to commit this to the repository - which is good, because passwords in repositories are a bad, bad thing.

  ```yaml
  host: localhost
  database: swen610
  user: swen610
  password: replaceme
  port: 5432
  ```
- Create a file `config/gitlab-credentials.yml` with the following content. This is one of the configuration files for our CI server runs on GitLab. When we run our tests on the server, we'll copy this over to replace `db.yml`.

  ```yaml
  # This file doesn't need to be changed. It's the same for everyone
  host: postgres
  database: swen610
  user: swen610
  password: whowatchesthewatchmen
  port: 5432
  ```
- Next is our `.gitlab-ci.yml`. A key difference from the DB project is that in the `before_script` we run a file called `server.py`. The `&` is key here, too, as it tells the shell to run that job in the background and continue to the next command. The `sleep` command gives the server an extra 3 seconds to boot up.

  ```yaml
  image:
    name: kalrabb/docker-344-v2025:latest
  services:
    - postgres:17
  variables:
    POSTGRES_DB: swen610
    POSTGRES_USER: swen610
    POSTGRES_PASSWORD: whowatchesthewatchmen
    PYTHON_RUN: python3
  before_script:
    - pip install -r requirements.txt
    - cp config/gitlab-credentials.yml config/db.yml
    - $PYTHON_RUN --version
    - $PYTHON_RUN src/server.py &   # fire up the server before we run our tests
    - sleep 3
  testrunner:
    script:
      - $PYTHON_RUN -m unittest -v  # run the unit tests; -v prints the test being run
    stage: test
  ```
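The "start it and keep going" idea behind `&` can be mimicked from Python, if that helps it click. This sketch is illustrative only - the CI itself uses the shell's `&`, not this code:

```python
import subprocess
import sys
import time

# Popen starts a child process and returns immediately, like `command &`.
# The child here just sleeps briefly, standing in for a server booting up.
proc = subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(0.2)'])

time.sleep(0.5)  # like the CI's `sleep 3`: give the child time to get going
print(proc.poll())  # None while still running; the exit code once finished
```

The `sleep` matters because the very next CI command (the test run) would otherwise race a server that hasn't finished binding its port yet.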
- Now we need to tell Python which packages we need. At the root of the repository, make a file called `requirements.txt` with this content. Note: if you ever need to add new Python packages to your project, feel free to add them here and they will get installed on every run of the CI. You should also run this once on your local device to install the dependencies, using `pip install -r requirements.txt`. NOTE: macOS users should follow the guidance from the DB project re: psycopg2-binary, i.e. if you installed psycopg2-binary, don't attempt to re-install it from requirements.txt.
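The package list itself didn't survive into this copy of the instructions. Judging purely from the imports used throughout this setup (`flask`, `flask_restful`, `psycopg2`, `yaml`, `requests`), a plausible `requirements.txt` would look something like the sketch below - but if your course materials provide an exact list (possibly with pinned versions), use that instead:

```
Flask
Flask-RESTful
psycopg2-binary
PyYAML
requests
```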
- Let's create our test data set. Create a file called `src/db/test_data.sql` with this content:

  ```sql
  -- We specify our primary key here to be as repeatable as possible
  INSERT INTO example_table(id, foo) VALUES (1, 'hello, world!');

  -- Restart our primary key sequence here so inserting id=DEFAULT won't collide
  ALTER SEQUENCE example_table_id_seq RESTART 1000;
  ```
- And finally, here is our schema file, in `src/db/schema.sql`:

  ```sql
  DROP TABLE IF EXISTS example_table;

  CREATE TABLE example_table(
      id SERIAL PRIMARY KEY,
      foo TEXT NOT NULL
  );
  ```
- We need our database layer that calls Postgres. Make a file called `src/db/example.py` with this content:

  ```python
  import os
  from .swen610_db_utils import *

  def rebuild_tables():
      exec_sql_file('src/db/schema.sql')
      exec_sql_file('src/db/test_data.sql')

  def list_examples():
      """This is an example. Please remove from your code before the REST1 deadline.
      DB layer call for listing all rows of our example.
      """
      return exec_get_all('SELECT id, foo FROM example_table')
  ```
- You will also need our database utility in `src/db/swen610_db_utils.py` (note it's in the `db` directory now). Here are the contents for that:

  ```python
  import psycopg2
  import yaml
  import os

  def connect():
      config = {}
      yml_path = os.path.join(os.path.dirname(__file__), '../../config/db.yml')
      with open(yml_path, 'r') as file:
          config = yaml.load(file, Loader=yaml.FullLoader)
      return psycopg2.connect(dbname=config['database'],
                              user=config['user'],
                              password=config['password'],
                              host=config['host'],
                              port=config['port'])

  def exec_sql_file(path):
      full_path = os.path.join(os.path.dirname(__file__), f'../../{path}')
      conn = connect()
      cur = conn.cursor()
      with open(full_path, 'r') as file:
          cur.execute(file.read())
      conn.commit()
      conn.close()

  def exec_get_one(sql, args={}):
      conn = connect()
      cur = conn.cursor()
      cur.execute(sql, args)
      one = cur.fetchone()
      conn.close()
      return one

  def exec_get_all(sql, args={}):
      conn = connect()
      cur = conn.cursor()
      cur.execute(sql, args)
      # https://www.psycopg.org/docs/cursor.html#cursor.fetchall
      list_of_tuples = cur.fetchall()
      conn.close()
      return list_of_tuples

  def exec_commit(sql, args={}):
      conn = connect()
      cur = conn.cursor()
      result = cur.execute(sql, args)
      conn.commit()
      conn.close()
      return result
  ```

  Let's do a quick review of what this code does:

  - `connect` will connect you to Postgres via our config. Closing this connection is up to you.
  - `exec_sql_file` will open up a SQL file and blindly execute everything in it. Useful for test data and your schema. Having your code in a SQL file also gives you syntax highlighting!
  - `exec_get_one` will run a query, assume you only want the top result, and return that. It does not commit any changes, so don't use it for updates.
  - `exec_get_all` will run a query and return all results as a list of tuples. It does not commit any changes, so don't use it for updates.
  - `exec_commit` will run SQL and then do a commit operation, so use this for statements that change data.
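All of these helpers take an optional `args` dict for query parameters. psycopg2 expects named `%(name)s` placeholders - the same syntax as Python's classic %-formatting, though psycopg2 escapes the values itself rather than pasting them in. Purely to show the placeholder shape (never build real SQL with string formatting):

```python
# Illustration of the %(name)s placeholder style psycopg2 uses.
# NOTE: plain Python %-formatting is used here ONLY to show the shape;
# in real code, pass the args dict to exec_get_one/exec_get_all/exec_commit
# and let psycopg2 do the escaping (this prevents SQL injection).
query = 'SELECT id, foo FROM example_table WHERE id = %(id)s'
filled = query % {'id': 1}
print(filled)  # SELECT id, foo FROM example_table WHERE id = 1
```

So a real call would look like `exec_get_one('SELECT foo FROM example_table WHERE id = %(id)s', {'id': 1})`.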
- Now we need to set up our server. Create a file called `src/server.py`. Here are its contents:

  ```python
  from flask import Flask
  from flask_restful import Resource, Api
  from api.hello_world import HelloWorld
  from api.management import *

  app = Flask(__name__)
  api = Api(app)

  api.add_resource(Init, '/manage/init')        # Management API for initializing the DB
  api.add_resource(Version, '/manage/version')  # Management API for checking the DB version
  api.add_resource(HelloWorld, '/')

  if __name__ == '__main__':
      rebuild_tables()
      app.run(debug=True)
  ```

  Notice that we rebuild the tables when the server starts up. You are welcome to adjust this to your liking. Maybe add some command-line arguments, or dynamically load all resource classes from that module. Totally up to you - this is just starter code.
- You'll notice that we reference a `HelloWorld` class. That should be in `src/api/hello_world.py` with this content:

  ```python
  from flask_restful import Resource
  from db import example

  class HelloWorld(Resource):
      def get(self):
          return dict(example.list_examples())
  ```

  Also add the management classes, which will initialize/set up the DB. These should be in `src/api/management.py` with this content:

  ```python
  from flask_restful import Resource, reqparse, request  # NOTE: Import from flask_restful, not plain Python
  from db.swen610_db_utils import *
  from db.example import rebuild_tables

  class Init(Resource):
      def post(self):
          rebuild_tables()

  class Version(Resource):
      def get(self):
          return (exec_get_one('SELECT VERSION()'))
  ```
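It's worth seeing what `dict(example.list_examples())` actually produces. The DB layer returns a list of `(id, foo)` tuples, and `dict()` turns a list of 2-tuples into a mapping; when Flask serializes that to JSON, the integer keys become strings, because JSON object keys are always strings. A quick illustration:

```python
import json

# What the DB layer hands back: a list of (id, foo) tuples
rows = [(1, 'hello, world!')]

as_dict = dict(rows)
print(as_dict)              # {1: 'hello, world!'}

# JSON object keys must be strings, so the 1 becomes '1' on the wire -
# which is why the API tests later expect {'1': 'hello, world!'}
print(json.dumps(as_dict))  # {"1": "hello, world!"}
```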
- Let's run our server now. Run `python src/server.py`. The console will look something like this:

  ```
  >python src\server.py
   * Serving Flask app "server" (lazy loading)
   * Environment: production
     WARNING: This is a development server. Do not use it in a production deployment.
     Use a production WSGI server instead.
   * Debug mode: on
   * Restarting with windowsapi reloader
   * Debugger is active!
   * Debugger PIN: 197-473-693
   * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
  ```
- Open up a browser and go to http://127.0.0.1:5000. Do you see the `hello, world` in JSON?
  - Also try the command-line tool `curl` (usually installed on Windows PCs) to test this simple endpoint, e.g. `curl http://localhost:5000` - you should see the same output.
- Now let's add some client-side test code. As in the DB project, we should test that Postgres is working. Make a file called `tests/db/test_postgresql.py` and put this in it:

  ```python
  import unittest
  from tests.test_utils import *

  class TestPostgreSQL(unittest.TestCase):
      def test_can_connect(self):
          version = get_rest_call(self, 'http://localhost:5000/manage/version')
          self.assertTrue(version[0].startswith('PostgreSQL'))
  ```
- Now let's set up our schema and test data. Create this test, called `tests/db/test_db_schema.py`:

  ```python
  import unittest
  from tests.test_utils import *

  class TestDBSchema(unittest.TestCase):
      def test_rebuild_tables(self):
          """Rebuild the tables"""
          post_rest_call(self, 'http://localhost:5000/manage/init')
          count = get_rest_call(self, 'http://localhost:5000')
          self.assertEqual(len(count), 1)

      def test_rebuild_tables_is_idempotent(self):
          """Drop and rebuild the tables twice"""
          post_rest_call(self, 'http://localhost:5000/manage/init')
          post_rest_call(self, 'http://localhost:5000/manage/init')
          count = get_rest_call(self, 'http://localhost:5000')
          self.assertEqual(len(count), 1)
  ```
- You'll also need this test utility, called `tests/test_utils.py`:

  ```python
  import requests

  # The client (unittest) can only contact the server using RESTful API calls

  # For API calls using GET. params and header default to 'empty'
  def get_rest_call(test, url, params = {}, get_header = {}, expected_code = 200):
      response = requests.get(url, params, headers = get_header)
      test.assertEqual(expected_code, response.status_code,
                       f'Response code to {url} not {expected_code}')
      return response.json()

  # For API calls using POST. params and header default to 'empty'
  def post_rest_call(test, url, params = {}, post_header = {}, expected_code = 200):
      '''Implements a REST api call using the POST verb'''
      response = requests.post(url, params, headers = post_header)
      test.assertEqual(expected_code, response.status_code,
                       f'Response code to {url} not {expected_code}')
      return response.json()

  # For API calls using PUT. params and header default to 'empty'
  def put_rest_call(test, url, params = {}, put_header = {}, expected_code = 200):
      '''Implements a REST api call using the PUT verb'''
      response = requests.put(url, params, headers = put_header)
      test.assertEqual(expected_code, response.status_code,
                       f'Response code to {url} not {expected_code}')
      return response.json()

  # For API calls using DELETE. header defaults to 'empty'
  def delete_rest_call(test, url, delete_header = {}, expected_code = 200):
      '''Implements a REST api call using the DELETE verb'''
      response = requests.delete(url, headers = delete_header)
      test.assertEqual(expected_code, response.status_code,
                       f'Response code to {url} not {expected_code}')
      return response.json()
  ```
- We are now going to run the unit tests (so you will need a 2nd terminal). Make sure your server is still running, then run `python -m unittest`. The tests should pass, hopefully.
  - To see details of the unit tests, use the `-v` switch, i.e. `python -m unittest -v`
  - At this point, your file structure should look like this:

  ```
  rest-abc123/        // your username instead of abc123
  |   .gitignore
  |   .gitlab-ci.yml
  |   requirements.txt
  |
  +---config
  |       db.yml
  |       gitlab-credentials.yml
  |
  +---src
  |   |   server.py
  |   |   __init__.py
  |   |
  |   +---api
  |   |       hello_world.py
  |   |       management.py
  |   |       __init__.py
  |   |
  |   +---db
  |           example.py
  |           schema.sql
  |           swen610_db_utils.py
  |           test_data.sql
  |           __init__.py
  |
  +---tests
      |   test_utils.py
      |   __init__.py
      |
      +---api
      |       __init__.py
      |
      +---db
              test_db_schema.py
              test_postgresql.py
              __init__.py
  ```
- We're not quite done. We need some automated tests for our API! This is crucial. We're going to be using a Python library called `Requests` that simulates a browser. The Requests library is essentially a wrapper for opening up a network socket and sending HTTP data into it, and not much more. Make a test called `tests/api/test_example.py` with this content:

  ```python
  import unittest
  from tests.test_utils import *

  class TestExample(unittest.TestCase):
      def setUp(self):
          """Initialize DB using API call"""
          post_rest_call(self, 'http://localhost:5000/manage/init')
          print("DB Should be reset now")

      def test_hello_world(self):
          expected = { '1' : 'hello, world!' }
          actual = get_rest_call(self, 'http://localhost:5000')
          self.assertEqual(expected, actual)
  ```

  You may notice that we are calling a utility method called `get_rest_call()` and another called `post_rest_call()`. We made these methods ourselves to reduce repeated code and to check that each call was successful (you will find them in `tests/test_utils.py`). Feel free to make them your own.

  Also, you will notice we are using a RESTful API call in the `setUp()` function to perform the DB init that we used to call directly within the unittest setup. It's all client-server now, so we use the `management` endpoint to set up the DB.
- Let's run our tests. Hopefully they pass now. (Your output should be something like this.)

  ```
  python -m unittest -v
  test_hello_world (tests.api.test_example.TestExample) ... DB Should be reset now
  ok
  test_rebuild_tables (tests.db.test_db_schema.TestDBSchema)
  Rebuild the tables ... ok
  test_rebuild_tables_is_idempotent (tests.db.test_db_schema.TestDBSchema)
  Drop and rebuild the tables twice ... ok
  test_can_connect (tests.db.test_postgresql.TestPostgreSQL) ... ok
  ```

  Your file structure should look like this now:

  ```
  rest-abc123/        // your username instead of abc123
  |   .gitignore
  |   .gitlab-ci.yml
  |   requirements.txt
  |
  +---config
  |       db.yml
  |       gitlab-credentials.yml
  |
  +---src
  |   |   server.py
  |   |   __init__.py
  |   |
  |   +---api
  |   |       hello_world.py
  |   |       management.py
  |   |       __init__.py
  |   |
  |   +---db
  |           example.py
  |           schema.sql
  |           swen610_db_utils.py
  |           test_data.sql
  |           __init__.py
  |
  +---tests
      |   test_utils.py
      |   __init__.py
      |
      +---api
      |       test_example.py
      |       __init__.py
      |
      +---db
              test_db_schema.py
              test_postgresql.py
              __init__.py
  ```
- Once it's working, commit and push to GitLab. Make sure the CI works there as well.
- Important step. Break the code in a couple of ways and note the error messages. Specifically, do the following:
  - Add a Python syntax error in `src/api/hello_world.py` while the server is running. Note that the server dies as soon as you hit Save in your editor. Going to your browser will tell you the server can't be found.
  - Add a runtime exception to `src/api/hello_world.py` - say, add `foo.hello`, which will cause a `NameError` (`name 'foo' is not defined`). Go to http://localhost:5000 in your browser. You'll get a nifty debugging tool that shows you the stack trace. Go to a line in the stack trace, open up the console, and run some Python within that stack frame. You'll need to enter a PIN - that was printed to stdout when you started the server.
  - Stop the server and then run your tests. You'll see lots of text fly by, and probably an error that looks like `NewConnectionError('<urllib3.connection.HTTPConnection object at 0x03C88510>: Failed to establish a new connection')`.

  This is a really helpful practice. Whenever you "get things working" on a new piece of technology, think about the kinds of mistakes you might make, intentionally make them, and look at how each one presents itself in your development environment. That way you are less likely to be thrown off by cryptic error messages later on.
No need to tag your finished product at this time. This will be due along with `rest1`.
Specific Issues
PyCharm users
Your IDE will tell you that imports cannot be resolved. To fix this, right-click `src` in the project view and choose Mark Directory as > Sources Root.
Mac users
A (recent) update on macOS uses the default Flask port (5000) for other apps (AirPlay Receiver). This prevents Flask from listening on that port. You can either disable AirPlay Receiver (probably not the best idea), or change the Flask server port.

- To change the Flask port, modify the `app.run(debug=True)` line in `server.py` to `app.run(debug=True, port=4999)`, and make the corresponding change when calling the REST API from the client side, e.g. `actual = get_rest_call(self, 'http://localhost:5000')` becomes `actual = get_rest_call(self, 'http://localhost:4999')`. Obviously, the port doesn't have to be `4999` - it just needs to be some unused port.
- On some Macs, `localhost` doesn't correctly get mapped to the loopback address (don't ask me why...). You may need to use `127.0.0.1` instead of `localhost` (or manually modify your `hosts` file).