Qiskit documentation
The documentation content home for https://quantum.cloud.ibm.com/docs and https://docs.quantum.ibm.com (excluding API reference).
Preview the main branch at the staging link: https://qiskit.github.io/documentation/main
Refer to:
Licensing
This repository is dual-licensed to distinguish between code and content.
See the `LICENSE` file for more information about the code license, and the `LICENSE-DOCS` file for more information about the content license.

Improving IBM Quantum and Qiskit documentation
Maintaining up-to-date documentation is a huge challenge for any software project, but especially for a field like quantum computing, because advances in new research and technological capabilities come at a fast pace. As a result, we greatly appreciate anyone who takes the time to support us in keeping this content accurate and up to the highest quality standard possible, to benefit the broadest range of users.
Read on for more information about how to support this project:
Best ways to contribute to documentation
1. Report bugs, inaccuracies, or general content issues
This is the quickest, easiest, and most helpful way to contribute to this project and improve the quality of Qiskit® and IBM Quantum® documentation. There are a few different ways to report issues, depending on where you found the issue:
2. Suggest new content
If you think there are gaps in our documentation, or sections that could be expanded upon, we invite you to open a new content request issue.
Not every new content suggestion is a good fit for docs, nor are we able to prioritize every request immediately. However, we will do our best to respond to content requests in a timely manner, and we greatly appreciate our community’s efforts in generating new ideas.
If you are interested in writing the new content yourself, or already have some draft work you think could be integrated, please also mention that in the issue description. If your content suggestion is accepted, we will let you know and work with you to get the content written and reviewed.
Please note: we DO NOT accept unsolicited PRs for new pages or large updates to existing content. The content that we include in docs is carefully planned and curated by our content team, and it must go through the appropriate review process to ensure the quality is of the highest possible standard before deploying to production. As a result, we are very selective about which content suggestions are approved, and it is unlikely that PRs submitted without an associated approved content request will be accepted.
3. Validate existing issues
You can help the team prioritize already-open issues by doing the following:
4. Fix an open issue
You can look through the open issues we have in this repo and address them with a PR. We recommend focusing on issues with the “good first issue” label.
Before getting started on an issue, remember to do the following:
Once you have an issue to work on, see the “How to work with this repo” section below to get going, then open a PR.
Before opening a PR, remember to do the following:
Run `npm run check`.

Set up this repository
Git clone tip
This repository is very large, so it is slow to clone the first time. We recommend instead using the argument `--filter=blob:none`: this means that Git will lazily download file contents when you need them, rather than eagerly downloading everything on the initial clone.

We also recommend running this command once to tell Git to ignore the `gh-pages` branch, which is solely used for PR previews and is very large:

Git email address
To contribute code to this repository, your email must be configured properly with Git, as follows:

1. Check your current email with `git config --get user.email`. You do not need to make any further changes if the email address is one of the addresses set up with your github.com account.
2. Otherwise, run `git config --global user.email <your-email>`. Use an email address associated with your github.com account.

Prerequisites to building the docs locally
We provide several tools to preview the documentation locally and to run quality checks. While many of these tools run in your PR through CI, we recommend installing them locally for faster iteration.
First, install the following software:
Install `tox` with `pipx install tox`. Then, install the dependencies with:
Preview the docs locally
You can preview the docs locally as follows: run `./start` in your terminal, and open http://localhost:3000 in your browser. On Windows, run `python start` instead; alternatively, use Windows Subsystem for Linux and run `./start`.

The preview application does not include the top nav bar. Instead, navigate to the folder you want with the links on the home page. You can return to the home page at any time by clicking “IBM Quantum Documentation Preview” in the top-left of the header.
Maintainers: when you release a new version of the image, you need to update the image digest in `./start` by following the instructions at the top of the file and opening a pull request.

Tip: Periodically prune Docker
Occasionally, Docker might fail when it runs out of disk space. For example, you might encounter an error like this:
Try running `docker system prune` to clear Docker’s system space.

API docs authors: How to preview your changes
API docs authors can preview their changes to one of the APIs by using the `-a` parameter to specify the path to the docs folder: `npm run gen-api -- -p <pkg-name> -v <version> -a <path/to/docs/_build/html>`. Then run `./start` and open up http://localhost:3000, as explained in the prior section.

Run quality checks
We use multiple tools to ensure that documentation meets high standards. These tools will run automatically in your PR through CI, but it is much faster to run the checks locally.
Use `./fix` to automatically apply fixes, like fixing formatting. Warning: we cannot automatically fix every problem, so you should also run `./check` afterwards.

Use `./check` to validate that there are no issues. If you encounter an error, fix it by following the instructions in the error message, then keep running `./check` and fixing any errors until it passes without errors.

On Windows, run `python fix` and `python check` instead. Alternatively, use Windows Subsystem for Linux and run `./fix` and `./check`.

VSCode: optional extensions
You may find it convenient to install the following VSCode extensions to automatically run some of our tools. Setting up these extensions is optional.
Advanced: run additional checks
We offer some tools that are not included in `./check` and `./fix`. Likewise, many of the checks skip API docs by default.

Run `npm run` to see a list of all our checks. For any particular check, run `npm run my-check -- --help` for more information and advanced arguments, such as `npm run check:markdown -- --help`.

Jupyter notebooks
Add a new notebook
When adding a new notebook, you’ll need to tell the testing tools how to handle it. To do this, add the file path to `scripts/config/notebook-testing.toml`. There are five categories:

- `[groups.normal]`: Notebooks to be run normally in CI. These notebooks can’t submit jobs, as the queue times are too long and it would waste resources. You can interact with IBM Quantum to retrieve jobs and backend information.
- `[groups.local-sim]`: Notebooks that submit jobs, but that are small enough to run on a 5-qubit simulator. We test these notebooks in CI by patching `least_busy` to return a 5-qubit fake backend.
- `[groups.test-eagle]`: These notebooks can’t run with a local simulator, but can be mocked with the `test-eagle` device, which returns nonsense results. We can trigger a manual job to test these with `test-eagle`.
- `[groups.cron-job-only]`: For notebooks that can’t be tested using the 5-qubit simulator patch. We skip testing these in CI and instead run them twice per month. Any notebooks with cells that take more than five minutes to run are also deemed too big for CI. Try to avoid adding notebooks to this category if possible.
- `[groups.exclude]`: We never test these notebooks.

If you don’t do this step, you will get the error “FAILED scripts/nb-tester/test/test_notebook_classification.py::test_all_notebooks_are_classified”.
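The exact schema of `scripts/config/notebook-testing.toml` isn't reproduced here; as a purely illustrative sketch (the file paths and the `notebooks` key name are assumptions, not the file's confirmed layout), entries might look something like:

```toml
# Hypothetical sketch of scripts/config/notebook-testing.toml entries
[groups.normal]
notebooks = [
    "docs/guides/hello-world.ipynb",
]

[groups.local-sim]
notebooks = [
    "docs/guides/submit-small-jobs.ipynb",
]

[groups.exclude]
notebooks = [
    "docs/guides/manual-only.ipynb",
]
```

Check the real file for the schema it actually uses before adding your notebook.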
If your notebook requires extra dependencies (such as LaTeX or graphviz), you must also add it to the `EXTRA_DEPS_NOTEBOOKS` list in `.github/workflows/main.yml`. We don’t install these for every job because they take a while to install and only a handful of notebooks use them. You will know when you need them because CI will fail.

Add package version information
Add a new markdown cell under your title with a `version-info` tag. When you execute the notebook (see the next section), the script will populate this cell with the package versions so users can reproduce the results. Do not add anything else to this cell, as its contents will be completely overwritten the next time the notebook is executed.

Execute notebooks
Before submitting a new notebook or code changes to a notebook, you must run the notebook using `tox -- --write <path-to-notebook>` and commit the results. If the notebook submits jobs, also use the argument `--test-strategy=hardware`. This means we can be sure all notebooks work and that users will see the same results when they run using the environment we recommend.

We use `tox` to execute notebooks in a reproducible Python environment. First, install `tox` using pipx: `pipx install tox`.

You may also need to install a few system dependencies: TeX, Poppler, and graphviz. On macOS, you can run `brew install mactex-no-gui poppler graphviz`. On Ubuntu, you can run `apt-get install texlive-pictures texlive-latex-extra poppler-utils graphviz`.

To save the results of a run, pass the `--write` argument. Since we only allow writing results from real hardware, you will usually also need to pass `--test-strategy=hardware`. Note this means the run will use QPU time.

When you make a pull request changing a notebook that doesn’t submit jobs, you can get a version of that notebook that was executed by tox from CI. To do this, click “Show all checks” in the info box at the bottom of the pull request page on GitHub, then choose “Details” for the “Test notebooks” job. From the job page, click “Summary”, then download “Executed notebooks”. Otherwise, if your notebook does submit jobs, you need to run it locally using the steps mentioned earlier.
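Collected into one sequence, the commands above look like this (the notebook path is a made-up example):

```shell
# One-time setup: install tox via pipx
pipx install tox

# Execute the notebook and write the results back into the file
tox -- --write docs/guides/my-notebook.ipynb

# If the notebook submits jobs, results must come from real hardware
# (note: this uses QPU time)
tox -- --write docs/guides/my-notebook.ipynb --test-strategy=hardware
```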
Ignore warnings in your notebook
We don’t want users to see warnings that can be avoided, so it’s best to fix the code to avoid them. However, if a warning is unavoidable, you can stop it from blocking CI by adding an `ignore-warnings` tag to the cell. In VSCode, right-click the cell, choose “Add cell tag”, type `ignore-warnings`, then press “Enter”. In Jupyter Notebook (depending on version), choose View > Right Sidebar > Show Notebook Tools, then under “Common Tools” add a tag with the text `ignore-warnings`.

Add extra code checks to your notebook
Our CI checks notebooks run from start to finish without errors or warnings. You can add extra checks in notebooks to catch other unexpected behavior.
For example, say we claim a cell always returns the string `0011`. It would be embarrassing if this was not true. We can assert this in CI by adding a code cell that checks the value, and hide it from users with a `remove-cell` tag.

In Jupyter notebooks, the underscore `_` variable stores the value of the previous cell's output. You should also add a comment like `# Confirm output is what we expect` so that authors know this block is only for testing. Make sure you add the tag `remove-cell`. If something ever causes this value to change, CI will alert us.

Format README and TypeScript files
Run `npm run fmt` to automatically format the README, the `.github` folder, and the `scripts/` folder. You should run this command if you get the CI error “run Prettier to fix”.

To check that formatting is valid without actually making changes, run `npm run check:fmt`.

Regenerate an existing API docs version
This is useful when we make improvements to the API generation script.
You can regenerate all API docs versions by following these steps:

1. Create a new branch from `main` using `git checkout -b <branch-name>`.
2. Check for uncommitted changes with `git status`, and create a new commit for them if necessary.
3. Run `npm run regen-apis` to regenerate all API docs versions for `qiskit`, `qiskit-ibm-runtime`, and `qiskit-ibm-transpiler`.

Each regenerated version will be saved as a distinct commit. If the changes are too large for one single PR, consider splitting it up into multiple PRs by using `git cherry-pick` or `git rebase -i` so each PR only has the commits it should target.

If you only want to regenerate the latest stable minor release of each package, add `--current-apis-only` as an argument. If you only want to regenerate versions of one package, use the `-p <pkg-name>` argument.

Alternatively, you can regenerate one specific version:
1. Choose the package (`qiskit`, `qiskit-ibm-runtime`, or `qiskit-ibm-transpiler`) and its version.
2. Run `npm run gen-api -- -p <pkg-name> -v <version>`, e.g., `npm run gen-api -- -p qiskit -v 0.45.0`.

If the version is not for the latest stable minor release series, add `--historical` to the arguments. For example, use `--historical` if the latest stable release is 0.45.* but you’re generating docs for the patch release 0.44.3.

Additionally, if you are regenerating a dev version, you can add `--dev` as an argument, and the documentation will be built at `/docs/api/<pkg-name>/dev`. For dev versions, end the `--version` in `-dev`, e.g., `-v 1.0.0-dev`. If a release candidate has already been released, use, for example, `-v 1.0.0rc1`. In this case, no commit will be automatically created.
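As a recap of the flags described above (the versions are illustrative, and passing flags to `regen-apis` through `npm run` with `--` is an assumption modeled on the `gen-api` examples):

```shell
# Regenerate all API docs versions for qiskit, qiskit-ibm-runtime,
# and qiskit-ibm-transpiler
npm run regen-apis

# Only the latest stable minor release of each package
npm run regen-apis -- --current-apis-only

# Only one package
npm run regen-apis -- -p qiskit

# One specific version outside the latest stable minor series
npm run gen-api -- -p qiskit -v 0.44.3 --historical

# A dev version, built at /docs/api/qiskit/dev (no commit is created)
npm run gen-api -- -p qiskit -v 1.0.0-dev --dev
```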
Generate new API docs
Use this process when we want to publish new API docs, such as when we release a new version of a package like Qiskit SDK.
Prerequisite: GitHub token
You must have the environment variable `GITHUB_TOKEN` saved to your environment. You can check whether it is set by running `echo $GITHUB_TOKEN` in your terminal.

The easiest way to set up a new token is with the GitHub CLI:

1. Run `gh auth login`. Follow the instructions to log in to github.com (not enterprise GitHub).
2. Run `gh auth status --show-token`. Find the token for github.com; make sure it is not for enterprise GitHub.
3. Save the token as `GITHUB_TOKEN` in your environment, such as by updating `~/.zprofile` with `export GITHUB_TOKEN=<token>`.

Key terms
The process depends on which type of release you are generating. Some key terms to know:

- Latest docs: In the `qiskit/documentation` repository, the latest docs are at the top level of the API folder, e.g., the files at `docs/api/qiskit` (not `docs/api/qiskit/dev` or `docs/api/qiskit/2.3`).
- Historical docs: In the `qiskit/documentation` repository, historical docs are stored in a folder with their version number, such as `docs/api/qiskit/0.46` and `docs/api/qiskit/2.3`.
- Dev docs: Qiskit publishes dev docs from its `main` branch several times a week; the other packages do not have dev docs. In the `qiskit/documentation` repository, dev docs are stored in the `dev` folder, e.g., `docs/api/qiskit/dev`.
- Release candidate docs: These are stored in the `dev/` folder, like `docs/api/qiskit/dev`.

Initial steps
All release types start with the following steps:

1. Check out `main` of `Qiskit/documentation` and pull any updates. Then, create a new Git branch.
2. Determine the package (`qiskit` or `qiskit-ibm-runtime`) and its full version, e.g., `0.45.2` or `1.2.0rc1`.
3. Find the package's CI run for that release, e.g., `0.45.2`. For the `rc1` release, look for `main` in blue text and a run description like “Prepare 2.3.0rc1”.
4. Download the artifact `html_docs`.
5. Rename the download to the minor version: e.g., for `0.45.2`, rename `html_docs.zip` to `0.45.zip`. For release candidates (rc), use a value like `2.3-rc.zip`.
6. Upload the renamed file to Box. If this is the `rc1` release or a new minor version like `2.3.0`, it will be a new file.

Final steps for patch releases
Examples of when to use this process:
- The current latest version is `2.3.0`, and `2.3.1` is released.
- Going from `2.3.2` to `2.3.3`.
- The current release candidate is `2.3.1rc1`, and `2.3.1rc2` is released.

Steps:
1. For a release candidate, run `npm run gen-api -- -p <pkg-name> -v <version> --dev`, e.g., `npm run gen-api -- -p qiskit -v 2.3.0rc2 --dev`.
2. For a patch to a historical version, run `npm run gen-api -- -p <pkg-name> -v <version> --historical`, e.g., `npm run gen-api -- -p qiskit -v 2.3.2 --historical`.
3. For a patch to the latest stable minor version, run `npm run gen-api -- -p <pkg-name> -v <version>`, e.g., `npm run gen-api -- -p qiskit -v 2.3.2`.
4. For the Qiskit C API, use `-p qiskit-c` instead of `-p qiskit`.

Final steps for the rc1 release
This process is only for the first release candidate (rc1). Subsequent release candidates, like rc2, should use the process for patch releases.

1. In Box, click the Copy shared link button.
2. Select People with the link from the menu under “Share Link” (the default is Invited people only) and go to Link Settings.
3. Under Link Expiration, select Disable Shared Link on and set an expiration date of ~10 years into the future. (There must be an expiration date.)
4. Copy the direct link from the Shared Link Settings tab. Do not use the link from the prior screen.
5. Open `scripts/config/api-html-artifacts.json` and find the `dev` entry for the package, like `qiskit.dev`.
6. Replace the `dev` link with the Box link, rather than the GitHub link.
7. Run `npm run gen-api -- -p <pkg-name> -v <version> --dev`, e.g., `npm run gen-api -- -p qiskit -v 2.3.0rc1 --dev`. For the Qiskit C API, use `-p qiskit-c` instead of `-p qiskit`.

Final steps for a new minor or major version
Examples of when to use this process:

- The current latest version is `2.3.1`, and `2.4.0` is released.
- The current latest version is `2.3.1`, and `3.0.0` is released.

Steps:

1. In Box, click the Copy shared link button.
2. Select People with the link from the menu under “Share Link” (the default is Invited people only) and go to Link Settings.
3. Under Link Expiration, select Disable Shared Link on and set an expiration date of ~10 years into the future. (There must be an expiration date.)
4. Copy the direct link from the Shared Link Settings tab. Do not use the link from the prior screen.
5. Update `scripts/config/api-html-artifacts.json` by adding the new version with the direct link from step 4.
6. Update the `_package.json` file in the package’s top-level folder, such as `docs/api/qiskit/_package.json`.
7. Run `npm run gen-api -- -p <pkg-name> -v <version> --historical`, using the version from the previous step. For example, `npm run gen-api -- -p qiskit -v 0.2.1 --historical`. For the Qiskit C API, use `-p qiskit-c` instead of `-p qiskit`.
8. Run `npm run gen-api -- -p <pkg-name> -v <version>`, using the version from the new release. For example, `npm run gen-api -- -p qiskit -v 0.3.0`. For the Qiskit C API, use `-p qiskit-c` instead of `-p qiskit`.
9. Find the latest CI run on the package's `main` branch by looking at the middle column with the blue text; look for `main`.
10. Download the artifact `html_docs` and note its artifact number, e.g., `6026447195` from the link https://github.com/Qiskit/qiskit/actions/runs/23345366690/artifacts/6026447195.
11. In `api-html-artifacts.json`, update the `dev` entry with the following value, replacing `<NUMBER>` with the number from the prior step. Qiskit: `https://api.github.com/repos/Qiskit/qiskit/actions/artifacts/<NUMBER>/zip`. Qiskit Runtime: `https://api.github.com/repos/Qiskit/qiskit-ibm-runtime/actions/artifacts/<NUMBER>/zip`.
12. Determine the dev version by ending it in `-dev`. For example, if the latest release is `2.3.0`, then the dev version would be `2.4.0-dev`.
13. Run `npm run gen-api -- -p <pkg-name> -v <version> --dev`, e.g., `npm run gen-api -- -p qiskit -v 2.4.0-dev --dev`. For the Qiskit C API, use `-p qiskit-c` instead of `-p qiskit`.
14. Check the changes in the `/dev` folder.
objects.inv(advanced)Since
objects.invis compressed, we can’t review changes throughgit diff. Git does tell you if the file has changed, but this isn’t that helpful as the compressed file can be different even if the uncompressed contents are the same. If you want to see the diff for the uncompressed contents, first installsphobjinv.The add the following to your
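One way to wire this up is a Git `textconv` driver. The snippet below is a hedged sketch, not the repository's exact configuration: it assumes `sphobjinv convert plain` accepts `-` to write the decompressed inventory to stdout, and the driver name `objects_inv` is arbitrary.

```ini
# Sketch: define a textconv driver that decompresses objects.inv for diffing
[diff "objects_inv"]
    textconv = sh -c 'sphobjinv convert plain "$0" -'
```

You would also need a `.gitattributes` entry such as `objects.inv diff=objects_inv` so Git applies the driver to those files.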
.gitconfig(usually found at~/.gitconfig).Dependabot - upgrade notebook testing version
When a new version of an API is released, we should also update `nb-tester/requirements.txt` to ensure that our notebooks still work with the latest version of the API. You can do this upgrade either manually or by waiting for Dependabot’s automated PR.

CI will fail on Dependabot’s PR at first because it does not have access to the token. To fix this, have someone with write access close and then immediately reopen the PR to trigger CI.
You can land the API generation separately from the `requirements.txt` version upgrade. It’s a high priority to get new versions of the API docs out ASAP, so you should not block that on the notebook version upgrade if you run into any complications, like failing notebooks.

Deploy guides & API docs
See the section “Syncing content with open source repo” in the internal docs repo’s README.