So I figured out a way to do simple regex and LL(1) parsing inside a SQL92-compliant RDBMS, using stored procedures and a few tables for parsing and scanning.

The upshot is that if you pass in complicated structured textual data - anything from a URL to an email address to, say, JSON - the database can validate and perhaps even normalize it for you.
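As a rough illustration of the idea (not the poster's actual design - the table names, schema, and DFA here are invented for the example), a table-driven scanner can live entirely in the database: transitions go in one table, accepting states in another, and matching is one lookup per character. Sketched in Python against SQLite:

```python
import sqlite3

# Hypothetical sketch of regex matching with "a few tables":
# a DFA for the pattern [0-9]+ stored as rows, driven by SELECTs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dfa_transition (state INTEGER, ch TEXT, next_state INTEGER);
CREATE TABLE dfa_accept (state INTEGER PRIMARY KEY);
-- state 0 = start, state 1 = "seen at least one digit" (accepting)
INSERT INTO dfa_accept VALUES (1);
""")
for d in "0123456789":
    conn.execute("INSERT INTO dfa_transition VALUES (0, ?, 1)", (d,))
    conn.execute("INSERT INTO dfa_transition VALUES (1, ?, 1)", (d,))

def matches(conn, text):
    """Run the DFA, looking up each transition with a query."""
    state = 0
    for ch in text:
        row = conn.execute(
            "SELECT next_state FROM dfa_transition WHERE state=? AND ch=?",
            (state, ch)).fetchone()
        if row is None:          # no transition: reject
            return False
        state = row[0]
    return conn.execute(
        "SELECT 1 FROM dfa_accept WHERE state=?",
        (state,)).fetchone() is not None

print(matches(conn, "12345"))  # True
print(matches(conn, "12a45"))  # False
```

In a pure stored-procedure version the driver loop would be a proc rather than host-language code, but the table layout is the same.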

I don't want to expend the effort on something nobody cares about anymore though, and I've been out of business dev for years.

So is something like that useful still?

What I have tried:

I haven't tried anything yet. This isn't really that kind of question.
Posted
Updated 26-Apr-19 17:05pm

1 solution

The point of stored procedures is to put logic in the database, and to perform complex actions with a single DB call. I would never do this myself, but it comes down to where you want the logic to live. For validation I would say no: your application code should report invalid input to the user without wasting the database's time.
 
 
Comments
honey the codewitch 27-Apr-19 1:02am    
It used to be de rigueur to do as much validation as was realistic in each tier of a multi-tiered app - for the same reason you'd sanitize DB inputs on a web page, but also to protect against attackers who manage to secure a user-level connection directly to the DB.

But that was years ago. 3 tier is old school these days.

Basically though, you'd have the web page provide first-tier validation, the middleware perform second-tier validation, and, where possible, do validation in the DB stored procs as well.

That way your data is hardened against being poisoned by external actors, no matter what tier they are operating at.
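A minimal sketch of that defense-in-depth idea (an invented example, not anything from the thread): the same rule enforced in the middle tier and again in the database, so a direct DB connection can't bypass it. Here the DB-side check is a CHECK constraint standing in for a stored-proc check:

```python
import sqlite3

# Assumed table and rule for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE users (
    email TEXT NOT NULL
        CHECK (email LIKE '%_@_%._%')  -- crude last line of defense in the DB
)
""")

def save_user(email):
    # Middle-tier validation: reject early, before a DB round trip.
    if "@" not in email or "." not in email.split("@")[-1]:
        raise ValueError("invalid email")
    conn.execute("INSERT INTO users VALUES (?)", (email,))

save_user("alice@example.com")  # passes both tiers
try:
    # An attacker with a direct connection skips the middleware...
    conn.execute("INSERT INTO users VALUES ('not-an-email')")
except sqlite3.IntegrityError as e:
    print("DB rejected it anyway:", e)  # ...but the DB-level check still holds
```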

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)



CodeProject, 20 Bay Street, 11th Floor Toronto, Ontario, Canada M5J 2N8 +1 (416) 849-8900