At the start of your project, make a list of the things your application framework must support. Some examples:
- CRUD functionality (Create, Read, Update, Delete); see the sketch after this list
- Menu and security features
- Use and storage of application configuration data
- Special user interface requirements
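To make the first item concrete, here is a minimal sketch of what generic CRUD support in such a framework could look like. The names (`Repository`, `InMemoryRepository`) are hypothetical and not taken from any particular framework; a real implementation would be backed by a database rather than a map.

```java
import java.util.*;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical generic CRUD contract a framework could offer to the application.
interface Repository<T, ID> {
    ID create(T entity);
    Optional<T> read(ID id);
    void update(ID id, T entity);
    void delete(ID id);
}

// Simple in-memory implementation, good enough for the small "proof" part
// of the application used to verify that the framework actually works.
class InMemoryRepository<T> implements Repository<T, Long> {
    private final Map<Long, T> store = new HashMap<>();
    private final AtomicLong sequence = new AtomicLong();

    public Long create(T entity) {
        Long id = sequence.incrementAndGet();
        store.put(id, entity);
        return id;
    }
    public Optional<T> read(Long id)      { return Optional.ofNullable(store.get(id)); }
    public void update(Long id, T entity) { store.put(id, entity); }
    public void delete(Long id)           { store.remove(id); }
}
```

A real framework would add persistence, transactions, and the security and configuration features from the list, but even a stub like this lets you exercise the framework's API early.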
Once you have built your framework, build a small part of your application to verify that the framework actually works.
After that you can build the rest of your application quickly and without worries. Integrating framework-like aspects into an existing application requires a lot of refactoring, which is much more expensive than getting the framework right first. And starting full-scale application development while your framework is unfinished requires temporary stubs that must be replaced by the real thing later, which again costs far more than getting the framework right first.
This is not just my opinion; it is a simple extension of all the studies showing that fixing defects becomes more expensive the later in the project it is done. Some people think that “First things first” only fits a waterfall approach to software development, but I disagree. In a Scrum project, for example, the “first things” can be given a high priority so they are done in the first sprints.
And I know there are many differences between building a house and building software, but would you lay the carpet before installing the plumbing?