Attention: This is a hobby project to get more familiar with Go programming. It is not intended for use in a production environment without taking further security-related steps.
How it works
tinymfa connects to a Postgres database and creates the required table structures. It then generates a root encryption key and an access token. The encryption key is stored on the filesystem.
When creating an issuer, a new encryption key is generated, encrypted with the root encryption key, and stored in the database. An access token unique to this issuer is generated as well.
When creating a user below an issuer, a new secret key is generated and encrypted with the issuer's encryption key.
The API offers an endpoint to generate a QR code for a user. Use this to let the user register their secret key in an authenticator app.
The API offers an endpoint to validate a token. Send the token via an HTTP POST request to the API interface. The resulting JSON object contains the boolean result of the validation.
tinymfa can be configured to validate access to its resources. Once activated, tinymfa checks for the presence of the HTTP header 'tiny-mfa-access-token'. Its value must be either the root token created on installation or the issuer token presented upon issuer creation.
System Configuration and Audit
Return audit entries
Return current system configuration
Update the system configuration
Payload: update the system configuration
The port to run on. Requires a restart!
How many times a user is allowed to enter a wrong token before further validation attempts are denied. This is to defeat brute-force attacks.
Whether to verify that the tiny-mfa-access-token header is set and contains a valid token
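Taken together, an update payload might look like the following sketch. The key names and values are assumptions for illustration only — the source does not show the actual JSON schema:

```json
{
  "http_port": 8080,
  "deny_limit": 3,
  "verify_tokens": true
}
```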
IdentityIQ 7.2 introduced new functionality for multi-factor authentication. To leverage
this functionality without needing an RSA or DUO account, the
TinyMFA Plugin was created.
The plugin implements the RFC for TOTP-based tokens, so (in theory) it is compatible with
any app that can register QR codes and calculate TOTP tokens (tested with
Google Authenticator, DUO Mobile, and FreeOTP). To maximize security, each user
is assigned a unique 128-bit key that is used to calculate the token.
The plugin ships with an MFA configuration, a DynamicScope, and assignable Capabilities to
grant users access to their personalized token (via QR code) and to force
users to authenticate with a token.
After installing the plugin, a new Capability "TinyMFA Plugin Access" is introduced.
It grants access to the plugin page, where users can see their personalized QR
code and check whether their authenticator was registered successfully.
The Capability "TinyMFA activated Identity" can be assigned; every user holding this
capability becomes part of the corresponding TinyMFA Dynamic Scope, and
identities in that scope are forced to authenticate with a token.
The Capability "TinyMFA Administrator" grants access to a simple admin page,
allowing the administrator to review login attempts and disable/enable
June 2016: An initiative wants to bring SSL to the people. Let's Encrypt is out of beta status, already serving 1.3 million certificates (according to them), and is eager to tell everyone that it happily delivers SSL to everyone, without exception.
Well, except to those that use IPv6! Really, what is wrong with you people at Let's Encrypt? Do you realize that these days you can get an SSL certificate cheaper than an IPv4 address? It is because of people like you that we still need to talk about the change to IPv6. This has been necessary for ten years at the time of writing.
This really makes me angry. Not because I urgently need their SSL certificates, but because I only realized this after installing 350 megabytes of Python dependencies for their certbot tool.
Nor does it help that they have supposedly been making good progress on this since the end of January. Honestly, Let's Encrypt, what's the matter? Are you confused by those long addresses?
Thanks for nothing, Let's Encrypt. Please do your homework; your 1.3 million certificates don't impress me. Get the basics straight!
"What is he talking about? CSV is a great format, and the easiest to use by far! You just String.split() by ';' and you're done!!!"
Well, I’ll try to be honest: You are very wrong.
First of all: CSV is not a format. It is a bunch of values separated by a character (despite the official name Comma-Separated Values, in practice it's Character-Separated Values). There might be two of them. Or twenty. Or two hundred. We don't know until we parse it. But even then we cannot say for sure that the column we are currently processing actually contains the data we are expecting. We rely on hope. Therefore we write down some expectations about the CSV file and call this a format. Sometimes this works.
Second: If you are talking about the ease of handling CSV, you are talking about that colleague who exports her Excel sheets to a .csv file. Yeah, that's pretty easy. So let's be honest on this one, too: we decide to use CSV because it is the easiest thing to do for the customer. She does not need a specialized application to create structured data in a strict format. She just needs to open Excel, fill in a few columns, and click "Save as".
As nice as this is for our beloved customer: this is where every implementer's nightmare starts. Because the customer does not care what this CSV file looks like. What she does care about is what her pretty Excel sheet looks like, because that is what will (probably) be seen by someone (important). So she will use all her Excel skills and every aesthetic sense to create a great-looking workbook, including carriage returns and almost every obscure UTF-8 character the codepage has to offer. And I cannot even be mad at her: if the CSV is to include, let's say, a column for a description of something, I do not want to put that 2000-character description on one single line. If you have ever tried to do that with Excel, you will know that you are almost forced to use carriage returns!
But this leads us to
Third: You cannot just String.split() by ';' and be done. You need to check whether the columns are, by chance, also surrounded by quotes, because if so, each column might include at least one occurrence of your separator ';' that belongs to the value and must not be treated as a separator. You also have to consider carriage returns, so you need to implement read-ahead of your CSV file. And you can never be sure that the customer did not move a column or split one into two because "it looks prettier".
The list goes on and on, and over time you will be implementing a monster in the form of your very own CSV parser. By this time you will also have created your very own "format" of a CSV file, because after several months of bugfixing "that crappy parser thing" (which, by the way, has destroyed some of your reputation) you will have implemented at least some basic boundary checking on the supplied file. You will also be very tired at this point. Even if you were clever enough to use one of the several CSV parsers out there, you will still have a hard time.
Because you are fighting human creativity. You won't believe what people did to their Excel sheet just to make it look "perfect".
Because the customer is not aware that her creativity is causing problems. She will even be mad at you because you force her to make her Excel sheet look ugly.
Because you are forgetting one of the most important rules: Fix the sender, not the receiver.
Here we are, looking at Excel. And Excel does not do ANY checks for you. It just exports CSV. And most of the time it is capable of importing its own CSV export. Most of the time…
So, what to do? Well, I think it depends on the complexity of the data. Of course, a CSV processor for data created by another machine that also follows strict patterns will be implemented very fast. But as soon as Excel sheets are involved that transfer more than, let's say, 10 columns and 20 lines, you had better think about doing something else. From my personal experience, implementing a specialized GUI helps you check the boundaries as your customer enters data. Of course it will take more time to deliver. But I bet you will save several weeks of work fixing yet another issue that came up with the latest CSV file. Also, you have absolute control over the export format to be used. By using annotations on Java classes, you can marshal data to XML and unmarshal it back to objects without even thinking about how to parse the format.
Using SailPoint's Services Standard Build is okay if you need a quick and easy-to-set-up build environment. However, when you enter projects at bigger companies, you will realize that those companies rely on build systems around Git/Mercurial and Maven. This is one way of setting up a build using Maven.