Similar to the suggestion from @TenmanS14
Use a "staging table" to receive the details from the CSV, and make sure that table is dropped and recreated on each import. I've mimicked that behaviour using some dummy tables:
IF OBJECT_ID('staging_table') IS NOT NULL DROP TABLE staging_table;
SELECT * INTO staging_table FROM csv2;
So staging_table now contains as many columns as the clearing house has provided this time around. We can generate a comma-separated list of those columns like this:
DECLARE @listCols VARCHAR(MAX);
SELECT @listCols = COALESCE(@listCols + ',', '') + COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'staging_table'
ORDER BY ORDINAL_POSITION;
Which, for my sample data, looks like this:
Applicant,School_0,School_1,School_2,School_3,School_4,School_5
You can then use that list to generate the dynamic SQL needed to populate the target table:
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'INSERT INTO [target] (' + @listCols + N') SELECT ' + @listCols + N' FROM staging_table';
EXEC sp_executesql @sql;
Seven lines of code that should be able to handle all of your files, however many school columns each one contains.
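If you want to prototype the whole pattern outside SQL Server, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names are invented for illustration, and SQLite uses PRAGMA table_info where SQL Server would use INFORMATION_SCHEMA.COLUMNS, but the idea is the same: read the staging table's column metadata, build the comma-separated list, then run a dynamically generated INSERT.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Target table has more columns than this particular file provides
# (names are hypothetical, chosen to mirror the sample above).
cur.execute("CREATE TABLE target (Applicant TEXT, School_0 TEXT, "
            "School_1 TEXT, School_2 TEXT)")

# Staging table mirrors whatever columns arrived in today's CSV.
cur.execute("CREATE TABLE staging_table (Applicant TEXT, School_0 TEXT, "
            "School_1 TEXT)")
cur.execute("INSERT INTO staging_table VALUES "
            "('Alice', 'Springfield High', 'Shelbyville High')")

# Build the comma-separated column list from the staging table's metadata
# (PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)).
cols = [row[1] for row in cur.execute("PRAGMA table_info(staging_table)")]
list_cols = ",".join(cols)

# Generate and run the dynamic INSERT, matching columns by name;
# target columns the file didn't supply are simply left NULL.
cur.execute(f"INSERT INTO target ({list_cols}) SELECT {list_cols} "
            "FROM staging_table")

print(cur.execute("SELECT * FROM target").fetchall())
# → [('Alice', 'Springfield High', 'Shelbyville High', None)]
```

Note that interpolating identifiers into SQL is safe here only because the names come from the database's own metadata, not from user input; in production T-SQL you could wrap each name in QUOTENAME() for extra safety.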