The tool scores your answer on Structure, Completeness, Clarity, and Conciseness (0-10 each), then gives you one specific fix. No signup required.
Built with Laravel + Vue + Claude Sonnet 4.6. The scoring rubric is visible on the page + OG image.
Looking for feedback on the scoring calibration especially. Does it feel accurate to your experience?
Here is the DB schema (content_hash is there to save money: requests with an identical answer reuse the cached result instead of hitting the API again):
| # | column_name | data_type |
|----|--------------|--------------------------------------------------|
| 1 | id | bigint(20) unsigned |
| 2 | job_id | char(36) |
| 3 | ip_address | varchar(45) |
| 4 | user_id | bigint(20) unsigned |
| 5 | email | varchar(255) |
| 6 | question | text |
| 7 | answer | text |
| 8 | content_hash | varchar(64) |
| 9 | status | enum('pending','processing','complete','failed') |
| 10 | result | longtext |
| 11 | scores_count | int(10) unsigned |
| 12 | created_at | timestamp |
| 13 | updated_at | timestamp |
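For anyone curious how the dedup works, here is a minimal Python sketch of the idea (the real app is PHP/Laravel, and `call_model`, `score`, and the in-memory `cache` dict are hypothetical stand-ins for the actual job pipeline and DB lookup): hash the question + answer, and only call the model on a cache miss.

```python
import hashlib

def content_hash(question: str, answer: str) -> str:
    # SHA-256 hex digest is 64 chars, matching the varchar(64) column.
    # A separator byte keeps ("ab", "c") and ("a", "bc") from colliding.
    return hashlib.sha256((question + "\x00" + answer).encode("utf-8")).hexdigest()

# Hypothetical stand-in for the real LLM request; counts invocations
# so the dedup effect is visible.
calls = {"n": 0}

def call_model(question: str, answer: str) -> str:
    calls["n"] += 1
    return f"scored:{len(answer)}"

cache: dict[str, str] = {}  # content_hash -> cached result

def score(question: str, answer: str) -> str:
    key = content_hash(question, answer)
    if key in cache:
        return cache[key]  # cache hit: no paid API request
    result = call_model(question, answer)
    cache[key] = result
    return result
```

In the actual schema this maps to a lookup on content_hash before enqueuing a job; a second submission with the same answer just returns the stored result row.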