The NFL's rating system is a little more complicated than the one used by the NCAA.
In the NCAA system, the maximum possible rating is 1261.6, achieved if a passer completes every one of his passes and each completion is a 99-yard touchdown pass.
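That NCAA maximum can be checked with a quick calculation. Here is a small Python sketch; the NCAA passing-efficiency formula shown in the comments is the standard one, stated here only for comparison, since it is not derived in this article:

```python
# NCAA passing-efficiency formula (per attempt):
#   (8.4 * yards + 330 * TDs + 100 * completions - 200 * INTs) / attempts
# Best possible attempt: a completed 99-yard touchdown pass.
yards, tds, comps, ints, atts = 99, 1, 1, 0, 1
rating = (8.4 * yards + 330 * tds + 100 * comps - 200 * ints) / atts
print(round(rating, 1))  # 1261.6
```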
The NFL's system places arbitrary limits on the values used to calculate its rating. In 1973 the NFL looked at all of the available passing records in the league and came up with an average in four categories: Completion %, Yards-per-Attempt, Touchdown %, and Interception %. After coming up with averages in each category, they set limits on them that were roughly twice their value.
For completion percentage, 77.5% is the maximum allowed value in this equation. For yards-per-attempt, 12.5 is the maximum allowed. For touchdown percentage, 11.875% is the maximum value allowed.
Unlike the NCAA rating system, the NFL's does not allow any of the four category values to fall below 0. For interception percentage the limits run in the opposite direction, since interceptions hurt a rating rather than help it: 0% is the best possible value, and 9.5% is the cap; any interception rate at or above 9.5% earns the minimum value of 0 for that category.
To figure the NFL rating, a number is calculated for each category, with 2.375 being the maximum allowed value for each.
To figure the value for completion percentage: if completions divided by attempts is greater than 77.5%, then the value is 2.375. Otherwise, the value is calculated by taking the completion percentage, subtracting 30, and then dividing by 20.
To figure the value for yards-per-attempt: if yards divided by attempts is greater than 12.5, then the value is 2.375. Otherwise, the value is calculated by taking the average, subtracting 3, and then dividing by 4.
To figure the value for touchdown percentage: if touchdowns divided by attempts is greater than 11.875%, then the value is 2.375. Otherwise, the value is calculated by taking the touchdown percentage and then dividing by 5.
To figure the value for interception percentage (keeping in mind that interceptions are BAD and drag the rating down): if interceptions divided by attempts is greater than 9.5%, then the value is 0, the minimum. Otherwise, the value is calculated by subtracting the interception percentage from 9.5 and then dividing by 4.
Once each of these values has been calculated, add them together and then multiply that sum by 16.667.
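The four clamped category values and the final multiplication can be sketched in Python (the function and variable names here are my own, not official NFL terminology):

```python
def nfl_passer_rating(att, comp, yards, td, ints):
    """NFL passer rating: four category values, each clamped to [0, 2.375]."""
    def clamp(x):
        return max(0.0, min(2.375, x))

    a = clamp(((comp / att) * 100 - 30) / 20)   # completion percentage
    b = clamp((yards / att - 3) / 4)            # yards per attempt
    c = clamp(((td / att) * 100) / 5)           # touchdown percentage
    d = clamp((9.5 - (ints / att) * 100) / 4)   # interception percentage
    return (a + b + c + d) * 100 / 6

print(round(nfl_passer_rating(20, 12, 160, 2, 1), 2))  # 97.92
```

Note that the clamp enforces the 77.5%, 12.5, and 11.875% caps automatically, since each of those inputs produces exactly 2.375 before clamping.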
In a nutshell (keeping in mind the maximum and minimum values) the formula is as follows:
(A=Attempts, C=Completions, Y=Yards, T=Touchdowns, I=Interceptions)
[((((C/A)*100)-30)/20) + (((Y/A)-3)/4) + (((T/A)*100)/5) + ((9.5-((I/A)*100))/4)] * 16.667
In other words, if Billy Bob, the star quarterback of the hometown Thrashers, throws 20 passes and completes 12 of them (60% completions: (60 - 30) / 20 = 1.5) for 160 yards (8 yards per attempt: (8 - 3) / 4 = 1.25) with 2 touchdowns (10% touchdowns: 10 / 5 = 2.0) and 1 interception (5% interceptions: (9.5 - 5) / 4 = 1.125), then Billy Bob's rating would be (1.5 + 1.25 + 2.0 + 1.125) * 16.667 = 97.92.
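Billy Bob's numbers can be checked step by step with plain arithmetic, following the formula above:

```python
a = (12 / 20 * 100 - 30) / 20         # completion %:   1.5
b = (160 / 20 - 3) / 4                # yards/attempt:  1.25
c = (2 / 20 * 100) / 5                # touchdown %:    2.0
d = (9.5 - 1 / 20 * 100) / 4          # interception %: 1.125
print(round((a + b + c + d) * 16.667, 2))  # 97.92
```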
Note: The NFL takes the total of the individual values (here, 5.875), multiplies it by 100, and then divides that by 6. In the formula above, the 16.667 is derived by dividing 100 by 6. This rounding difference creates a small discrepancy of a few hundredths of a point in the final number.
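The size of that discrepancy is easy to see by computing both versions of the multiplier against Billy Bob's component total:

```python
total = 1.5 + 1.25 + 2.0 + 1.125     # 5.875
print(round(total * 16.667, 4))      # rounded multiplier: 97.9186
print(round(total * 100 / 6, 4))     # exact NFL method:   97.9167
```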