Historical and Revision Notes

Revised Section | Source (U.S. Code) | Source (Statutes at Large)
11301 | 40:1411. | Pub. L. 104–106, div. E, title LI, § 5111, Feb. 10, 1996, 110 Stat. 680.

Statutory Notes and Related Subsidiaries
Advancing American Artificial Intelligence

Pub. L. 117–263, div. G, title LXXII, subtitle B, Dec. 23, 2022, 136 Stat. 3668, provided that:

“SEC. 7221. SHORT TITLE.

“This subtitle may be cited as the ‘Advancing American AI Act’.

“SEC. 7222. PURPOSES.
“The purposes of this subtitle are to—
“(1)
encourage agency artificial intelligence-related programs and initiatives that enhance the competitiveness of the United States and foster an approach to artificial intelligence that builds on the strengths of the United States in innovation and entrepreneurialism;
“(2)
enhance the ability of the Federal Government to translate research advances into artificial intelligence applications to modernize systems and assist agency leaders in fulfilling their missions;
“(3)
promote adoption of modernized business practices and advanced technologies across the Federal Government that align with the values of the United States, including the protection of privacy, civil rights, and civil liberties; and
“(4)
test and harness applied artificial intelligence to enhance mission effectiveness, agency program integrity, and business practice efficiency.
“SEC. 7223. DEFINITIONS.
“In this subtitle:
“(1)
Agency.—
The term ‘agency’ has the meaning given the term in section 3502 of title 44, United States Code.
“(2)
Appropriate congressional committees.—
The term ‘appropriate congressional committees’ means—
“(A)
the Committee on Homeland Security and Governmental Affairs of the Senate;
“(B)
the Committee on Oversight and Reform [now Committee on Oversight and Accountability] of the House of Representatives; and
“(C)
the Committee on Homeland Security of the House of Representatives.
“(3)
Artificial intelligence.—
The term ‘artificial intelligence’ has the meaning given the term in section 238(g) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019 (10 U.S.C. 2358 note).
“(4)
Artificial intelligence system.—
The term ‘artificial intelligence system’—
“(A)
means any data system, software, application, tool, or utility that operates in whole or in part using dynamic or static machine learning algorithms or other forms of artificial intelligence, whether—
“(i)
the data system, software, application, tool, or utility is established primarily for the purpose of researching, developing, or implementing artificial intelligence technology; or
“(ii)
artificial intelligence capability is integrated into another system or agency business process, operational activity, or technology system; and
“(B)
does not include any common commercial product within which artificial intelligence is embedded, such as a word processor or map navigation system.
“(5)
Department.—
The term ‘Department’ means the Department of Homeland Security.
“(6)
Director.—
The term ‘Director’ means the Director of the Office of Management and Budget.
“SEC. 7224. PRINCIPLES AND POLICIES FOR USE OF ARTIFICIAL INTELLIGENCE IN GOVERNMENT.
“(a)
Guidance.—
The Director shall, when developing the guidance required under section 104(a) of the AI in Government Act of 2020 (title I of division U of Public Law 116–260) [see note below], consider—
“(1)
the considerations and recommended practices identified by the National Security Commission on Artificial Intelligence in the report entitled ‘Key Considerations for the Responsible Development and Fielding of AI’, as updated in April 2021;
“(2)
the principles articulated in Executive Order 13960 (85 Fed. Reg. 78939 [40 U.S.C. 11301 note]; relating to promoting the use of trustworthy artificial intelligence in Government); and
“(3)
the input of—
“(A)
the Administrator of General Services;
“(B)
relevant interagency councils, such as the Federal Privacy Council, the Chief Financial Officers Council, the Chief Information Officers Council, and the Chief Data Officers Council;
“(C)
other governmental and nongovernmental privacy, civil rights, and civil liberties experts;
“(D)
academia;
“(E)
industry technology and data science experts; and
“(F)
any other individual or entity the Director determines to be appropriate.
“(b)
Department Policies and Processes for Procurement and Use of Artificial Intelligence-enabled Systems.—
Not later than 180 days after the date of enactment of this Act [Dec. 23, 2022]—
“(1)
the Secretary of Homeland Security, with the participation of the Chief Procurement Officer, the Chief Information Officer, the Chief Privacy Officer, and the Officer for Civil Rights and Civil Liberties of the Department and any other person determined to be relevant by the Secretary of Homeland Security, shall issue policies and procedures for the Department related to—
“(A)
the acquisition and use of artificial intelligence; and
“(B)
considerations for the risks and impacts related to artificial intelligence-enabled systems, including associated data of machine learning systems, to ensure that full consideration is given to—
“(i)
the privacy, civil rights, and civil liberties impacts of artificial intelligence-enabled systems; and
“(ii)
security against misuse, degradation, or rending inoperable of artificial intelligence-enabled systems; and
“(2)
the Chief Privacy Officer and the Officer for Civil Rights and Civil Liberties of the Department shall report to Congress on any additional staffing or funding resources that may be required to carry out the requirements of this subsection.
“(c)
Inspector General.—
Not later than 180 days after the date of enactment of this Act, the Inspector General of the Department shall identify any training and investments needed to enable employees of the Office of the Inspector General to continually advance their understanding of—
“(1)
artificial intelligence systems;
“(2)
best practices for governance, oversight, and audits of the use of artificial intelligence systems; and
“(3)
how the Office of the Inspector General is using artificial intelligence to enhance audit and investigative capabilities, including actions to—
“(A)
ensure the integrity of audit and investigative results; and
“(B)
guard against bias in the selection and conduct of audits and investigations.
“(d)
Artificial Intelligence Hygiene and Protection of Government Information, Privacy, Civil Rights, and Civil Liberties.—
“(1)
Establishment.—
Not later than 1 year after the date of enactment of this Act, the Director, in consultation with a working group consisting of members selected by the Director from appropriate interagency councils, shall develop an initial means by which to—
“(A)
ensure that contracts for the acquisition of an artificial intelligence system or service—
“(i)
align with the guidance issued to the head of each agency under section 104(a) of the AI in Government Act of 2020 (title I of division U of Public Law 116–260);
“(ii)
address protection of privacy, civil rights, and civil liberties;
“(iii)
address the ownership and security of data and other information created, used, processed, stored, maintained, disseminated, disclosed, or disposed of by a contractor or subcontractor on behalf of the Federal Government; and
“(iv)
include considerations for securing the training data, algorithms, and other components of any artificial intelligence system against misuse, unauthorized alteration, degradation, or rendering inoperable; and
“(B)
address any other issue or concern determined to be relevant by the Director to ensure appropriate use and protection of privacy and Government data and other information.
“(2)
Consultation.—
In developing the considerations under paragraph (1)(A)(iv), the Director shall consult with the Secretary of Homeland Security, the Secretary of Energy, the Director of the National Institute of Standards and Technology, and the Director of National Intelligence.
“(3)
Review.—
The Director—
“(A)
should continuously update the means developed under paragraph (1); and
“(B)
not later than 2 years after the date of enactment of this Act and not less frequently than every 2 years thereafter, shall update the means developed under paragraph (1).
“(4)
Briefing.—
The Director shall brief the appropriate congressional committees—
“(A)
not later than 90 days after the date of enactment of this Act and thereafter on a quarterly basis until the Director first implements the means developed under paragraph (1); and
“(B)
annually thereafter on the implementation of this subsection.
“(5)
Sunset.—
This subsection shall cease to be effective on the date that is 5 years after the date of enactment of this Act.
“SEC. 7225. AGENCY INVENTORIES AND ARTIFICIAL INTELLIGENCE USE CASES.
“(a)
Inventory.—
Not later than 60 days after the date of enactment of this Act [Dec. 23, 2022], and continuously thereafter for a period of 5 years, the Director, in consultation with the Chief Information Officers Council, the Chief Data Officers Council, and other interagency bodies as determined to be appropriate by the Director, shall require the head of each agency to—
“(1)
prepare and maintain an inventory of the artificial intelligence use cases of the agency, including current and planned uses;
“(2)
share agency inventories with other agencies, to the extent practicable and consistent with applicable law and policy, including those concerning protection of privacy and of sensitive law enforcement, national security, and other protected information; and
“(3)
make agency inventories available to the public, in a manner determined by the Director, and to the extent practicable and in accordance with applicable law and policy, including those concerning the protection of privacy and of sensitive law enforcement, national security, and other protected information.
“(b)
Central Inventory.—
The Director is encouraged to designate a host entity and ensure the creation and maintenance of an online public directory to—
“(1)
make agency artificial intelligence use case information available to the public and those wishing to do business with the Federal Government; and
“(2)
identify common use cases across agencies.
“(c)
Sharing.—
The sharing of agency inventories described in subsection (a)(2) may be coordinated through the Chief Information Officers Council, the Chief Data Officers Council, the Chief Financial Officers Council, the Chief Acquisition Officers Council, or other interagency bodies to improve interagency coordination and information sharing for common use cases.
“(d)
Department of Defense.—
Nothing in this section shall apply to the Department of Defense.
“SEC. 7226. RAPID PILOT, DEPLOYMENT AND SCALE OF APPLIED ARTIFICIAL INTELLIGENCE CAPABILITIES TO DEMONSTRATE MODERNIZATION ACTIVITIES RELATED TO USE CASES.
“(a)
Identification of Use Cases.—
Not later than 270 days after the date of enactment of this Act [Dec. 23, 2022], the Director, in consultation with the Chief Information Officers Council, the Chief Data Officers Council, the Chief Financial Officers Council, and other interagency bodies as determined to be appropriate by the Director, shall identify 4 new use cases for the application of artificial intelligence-enabled systems to support interagency or intra-agency modernization initiatives that require linking multiple siloed internal and external data sources, consistent with applicable laws and policies, including those relating to the protection of privacy and of sensitive law enforcement, national security, and other protected information.
“(b)
Pilot Program.—
“(1)
Purposes.—
The purposes of the pilot program under this subsection include—
“(A)
to enable agencies to operate across organizational boundaries, coordinating between existing established programs and silos to improve delivery of the agency mission;
“(B)
to demonstrate the circumstances under which artificial intelligence can be used to modernize or assist in modernizing legacy agency systems; and
“(C)
to leverage commercially available artificial intelligence technologies that—
“(i)
operate in secure cloud environments that can deploy rapidly without the need to replace existing systems; and
“(ii)
do not require extensive staff or training to build.
“(2)
Deployment and pilot.—
Not later than 1 year after the date of enactment of this Act, the Director, in coordination with the heads of relevant agencies and Federal entities, including the Administrator of General Services, the Bureau of Fiscal Service of the Department of the Treasury, the Council of the Inspectors General on Integrity and Efficiency, and the Pandemic Response Accountability Committee, and other officials as the Director determines to be appropriate, shall ensure the initiation of the piloting of the 4 new artificial intelligence use case applications identified under subsection (a), leveraging commercially available technologies and systems to demonstrate scalable artificial intelligence-enabled capabilities to support the use cases identified under subsection (a).
“(3)
Risk evaluation and mitigation plan.—
In carrying out paragraph (2), the Director shall require the heads of agencies to—
“(A)
evaluate risks in utilizing artificial intelligence systems; and
“(B)
develop a risk mitigation plan to address those risks, including consideration of—
“(i)
the artificial intelligence system not performing as expected or as designed;
“(ii)
the quality and relevancy of the data resources used in the training of the algorithms used in an artificial intelligence system;
“(iii)
the processes for training and testing, evaluating, validating, and modifying an artificial intelligence system; and
“(iv)
the vulnerability of a utilized artificial intelligence system to unauthorized manipulation or misuse, including the use of data resources that substantially differ from the training data.
“(4)
Prioritization.—
In carrying out paragraph (2), the Director shall prioritize modernization projects that—
“(A)
would benefit from commercially available privacy-preserving techniques, such as use of differential privacy, federated learning, and secure multiparty computing; and
“(B)
otherwise take into account considerations of civil rights and civil liberties.
“(5)
Privacy protections.—
In carrying out paragraph (2), the Director shall require the heads of agencies to use privacy-preserving techniques when feasible, such as differential privacy, federated learning, and secure multiparty computing, to mitigate any risks to individual privacy or national security created by a project or data linkage.

AI in Government

Pub. L. 116–260, div. U, title I, Dec. 27, 2020, 134 Stat. 2286, provided that:

“SEC. 101. SHORT TITLE.

“This title may be cited as the ‘AI in Government Act of 2020’.

“SEC. 102. DEFINITIONS.
“In this Act [probably means “this title”]—
“(1)
the term ‘Administrator’ means the Administrator of General Services;
“(2)
the term ‘agency’ has the meaning given the term in section 3502 of title 44, United States Code;
“(3)
the term ‘AI CoE’ means the AI Center of Excellence described in section 103;
“(4)
the term ‘artificial intelligence’ has the meaning given the term in section 238(g) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019 (10 U.S.C. 2358 note);
“(5)
the term ‘Director’ means the Director of the Office of Management and Budget;
“(6)
the term ‘institution of higher education’ has the meaning given the term in section 101 of the Higher Education Act of 1965 (20 U.S.C. 1001); and
“(7)
the term ‘nonprofit organization’ means an organization described in section 501(c)(3)of [sic] the Internal Revenue Code of 1986 [26 U.S.C. 501(c)(3)] and exempt from taxation under section 501(a) of that Code [26 U.S.C. 501(a)].
“SEC. 103. AI CENTER OF EXCELLENCE.
“(a)
In General.—
There is created within the General Services Administration a program to be known as the ‘AI Center of Excellence’, which shall—
“(1)
facilitate the adoption of artificial intelligence technologies in the Federal Government;
“(2)
improve cohesion and competency in the adoption and use of artificial intelligence within the Federal Government; and
“(3)
carry out paragraphs (1) and (2) for the purposes of benefitting the public and enhancing the productivity and efficiency of Federal Government operations.
“(b)
Duties.—
The duties of the AI CoE shall include—
“(1)
regularly convening individuals from agencies, industry, Federal laboratories, nonprofit organizations, institutions of higher education, and other entities to discuss recent developments in artificial intelligence, including the dissemination of information regarding programs, pilots, and other initiatives at agencies, as well as recent trends and relevant information on the understanding, adoption, and use of artificial intelligence;
“(2)
collecting, aggregating, and publishing on a publicly available website information regarding programs, pilots, and other initiatives led by other agencies and any other information determined appropriate by the Administrator;
“(3)
advising the Administrator, the Director, and agencies on the acquisition and use of artificial intelligence through technical insight and expertise, as needed;
“(4)
assist agencies in applying Federal policies regarding the management and use of data in applications of artificial intelligence;
“(5)
consulting with agencies, including the Department of Defense, the Department of Commerce, the Department of Energy, the Department of Homeland Security, the Office of Management and Budget, the Office of the Director of National Intelligence, and the National Science Foundation, that operate programs, create standards and guidelines, or otherwise fund internal projects or coordinate between the public and private sectors relating to artificial intelligence;
“(6)
advising the Director on developing policy related to the use of artificial intelligence by agencies; and
“(7)
advising the Director of the Office of Science and Technology Policy on developing policy related to research and national investment in artificial intelligence.
“(c)
Staff.—
“(1)
In general.—
The Administrator shall provide necessary staff, resources, and administrative support for the AI CoE.
“(2)
Shared staff.—
To the maximum extent practicable, the Administrator shall meet the requirements described under paragraph (1) by using staff of the General Services Administration, including those from other agency centers of excellence, and detailees, on a reimbursable or nonreimbursable basis, from other agencies.
“(3)
Fellows.—
The Administrator may, to the maximum extent practicable, appoint fellows to participate in the AI CoE from nonprofit organizations, think tanks, institutions of higher education, and industry.
“(d)
Sunset.—
This section shall cease to be effective on the date that is 5 years after the date of enactment of this Act [Dec. 27, 2020].
“SEC. 104. GUIDANCE FOR AGENCY USE OF ARTIFICIAL INTELLIGENCE.
“(a)
Guidance.—
Not later than 270 days after the date of enactment of this Act [Dec. 27, 2020], the Director, in coordination with the Director of the Office of Science and Technology Policy in consultation with the Administrator and any other relevant agencies and key stakeholders as determined by the Director, shall issue a memorandum to the head of each agency that shall—
“(1)
inform the development of policies regarding Federal acquisition and use by agencies regarding technologies that are empowered or enabled by artificial intelligence, including an identification of the responsibilities of agency officials managing the use of such technology;
“(2)
recommend approaches to remove barriers for use by agencies of artificial intelligence technologies in order to promote the innovative application of those technologies while protecting civil liberties, civil rights, and economic and national security;
“(3)
identify best practices for identifying, assessing, and mitigating any discriminatory impact or bias on the basis of any classification protected under Federal nondiscrimination laws, or any unintended consequence of the use of artificial intelligence, including policies to identify data used to train artificial intelligence algorithms as well as the data analyzed by artificial intelligence used by the agencies; and
“(4)
provide a template of the required contents of the agency plans described in subsection (c).
“(b)
Public Comment.—
To help ensure public trust in the applications of artificial intelligence technologies, the Director shall issue a draft version of the memorandum required under subsection (a) for public comment not later than 180 days after [the] date of enactment of this Act.
“(c)
Plans.—
Not later than 180 days after the date on which the Director issues the memorandum required under subsection (a) or an update to the memorandum required under subsection (d), the head of each agency shall submit to the Director and post on a publicly available page on the website of the agency—
“(1)
a plan to achieve consistency with the memorandum; or
“(2)
a written determination that the agency does not use and does not anticipate using artificial intelligence.
“(d)
Updates.—
Not later than 2 years after the date on which the Director issues the memorandum required under subsection (a), and every 2 years thereafter for 10 years, the Director shall issue updates to the memorandum.
“SEC. 105. UPDATE OF OCCUPATIONAL SERIES FOR ARTIFICIAL INTELLIGENCE.
“(a)
In General.—
Not later than 18 months after the date of enactment of this Act [Dec. 27, 2020], and in accordance with chapter 51 of title 5, United States Code, the Director of the Office of Personnel Management shall—
“(1)
identify key skills and competencies needed for positions related to artificial intelligence;
“(2)
establish an occupational series, or update and improve an existing occupational job series, to include positions the primary duties of which relate to artificial intelligence;
“(3)
to the extent appropriate, establish an estimate of the number of Federal employees in positions related to artificial intelligence, by each agency; and
“(4)
using the estimate established in paragraph (3), prepare a 2-year and 5-year forecast of the number of Federal employees in positions related to artificial intelligence that each agency will need to employ.
“(b)
Plan.—
Not later than 120 days after the date of enactment of this Act, the Director of the Office of Personnel Management shall submit to the Committee on Homeland Security and Governmental Affairs of the Senate and the Committee on Oversight and Reform [now Committee on Oversight and Accountability] of the House of Representatives a comprehensive plan with a timeline to complete requirements described in subsection (a).”

GSA Modernization Centers of Excellence Program

Pub. L. 116–194, § 2, Dec. 3, 2020, 134 Stat. 981, provided that:

“(a)
Definitions.—
In this section:
“(1)
Cloud computing.—
The term ‘cloud computing’ has the meaning given the term in section 1076 of the National Defense Authorization Act for Fiscal Year 2018 [Pub. L. 115–91] (40 U.S.C. 11301 note) [set out below].
“(2)
Executive agency.—
The term ‘executive agency’ has the meaning given the term ‘Executive agency’ in section 105 of title 5, United States Code.
“(3)
Program.—
The term ‘Program’ means the Information Technology Modernization Centers of Excellence Program established under subsection (b).
“(b)
Establishment.—
The Administrator of General Services shall establish a program to be known as the Information Technology Modernization Centers of Excellence Program to facilitate the adoption of modern technology by executive agencies on a reimbursable basis.
“(c)
Responsibilities.—
The Program shall have the following responsibilities:
“(1)
To encourage the modernization of information technology used by an executive agency and how a customer interacts with an executive agency.
“(2)
To improve cooperation between commercial and executive agency information technology sectors.
“(3)
To the extent practicable, encourage the adoption of commercial items in accordance with section 3307 of title 41, United States Code.
“(4)
Upon request by the executive agency, to assist executive agencies with planning and adoption of technology in focus areas designated by the Administrator, which may include the following:
“(A)
A commercial cloud computing system that includes—
“(i)
end-to-end migration planning and an assessment of progress towards modernization; and
“(ii)
a cybersecurity and governance framework that promotes industry and government risk management best practice approaches, prioritizing efforts based on risk, impact, and consequences.
“(B)
Tools to help an individual receive support from and communicate with an executive agency.
“(C)
Contact centers and other related customer supports.
“(D)
Efficient use of data management, analysis, and reporting.
“(E)
The optimization of infrastructure, including for data centers, and the reduction of operating costs.
“(F)
Artificial intelligence.
“(5)
To share best practices and expertise with executive agencies.
“(6)
Other responsibilities the Administrator may identify.
“(d)
Coordination.—
The Administrator shall coordinate with the Secretary of Homeland Security in establishing the Program to ensure that the technology, tools, and frameworks facilitated for executive agencies by the Program provide sufficient cybersecurity and maintain the integrity, confidentiality, and availability of Federal information.
“(e)
Program Reporting.—
Not later than 1 year after the date of enactment of this Act [Dec. 3, 2020], and every year thereafter, the Administrator shall submit to the Committee on Homeland Security and Governmental Affairs of the Senate and the Committee on Oversight and Reform [now Committee on Oversight and Accountability] of the House of Representatives a report on the Program, which shall include the following:
“(1)
A description of the reimbursable agreements, statements of work, and associated project schedules and deliverables for the Program.
“(2)
Details on the total amount of the reimbursable agreements.
“(3)
Any additional information the Administrator determines necessary.
“(f)
Sunset.—
This Act shall cease to have effect on the date that is 7 years after the date of enactment of this Act.
“(g)
Rule of Construction.—
Nothing in this Act shall be construed to impair or otherwise affect the authority delegated by law to an executive agency or the head of an executive agency.”

Modernizing Government Technology

Pub. L. 115–91, div. A, title X, subtitle G, Dec. 12, 2017, 131 Stat. 1586, provided that:

“SEC. 1076. DEFINITIONS.
“In this subtitle:
“(1)
Administrator.—
The term ‘Administrator’ means the Administrator of General Services.
“(2)
Board.—
The term ‘Board’ means the Technology Modernization Board established under section 1094(c)(1) [probably should be “1078(c)(1)”].
“(3)
Cloud computing.—
The term ‘cloud computing’ has the meaning given the term by the National Institute of Standards and Technology in NIST Special Publication 800–145 and any amendatory or superseding document thereto.
“(4)
Director.—
The term ‘Director’ means the Director of the Office of Management and Budget.
“(5)
Fund.—
The term ‘Fund’ means the Technology Modernization Fund established under section 1094(b)(1) [probably should be “1078(b)(1)”].
“(6)
Information technology.—
The term ‘information technology’ has the meaning given the term in section 3502 of title 44, United States Code.
“(7)
IT working capital fund.—
The term ‘IT working capital fund’ means an information technology system modernization and working capital fund established under section 1093(b)(1) [probably should be “1077(b)(1)”].
“(8)
Legacy information technology system.—
The term ‘legacy information technology system’ means an outdated or obsolete system of information technology.
“SEC. 1077. ESTABLISHMENT OF AGENCY INFORMATION TECHNOLOGY SYSTEMS MODERNIZATION AND WORKING CAPITAL FUNDS.
“(a)
Definition.—
In this section, the term ‘covered agency’ means each agency listed in section 901(b) of title 31, United States Code.
“(b)
Information Technology System Modernization and Working Capital Funds.—
“(1)
Establishment.—
The head of a covered agency may establish within the covered agency an information technology system modernization and working capital fund for necessary expenses described in paragraph (3).
“(2)
Source of funds.—
The following amounts may be deposited into an IT working capital fund:
“(A)
Reprogramming and transfer of funds made available in appropriations Acts enacted after the date of enactment of this Act [Dec. 12, 2017], including the transfer of any funds for the operation and maintenance of legacy information technology systems, in compliance with any applicable reprogramming law or guidelines of the Committees on Appropriations of the Senate and the House of Representatives or transfer authority specifically provided in appropriations law.
“(B)
Amounts made available to the IT working capital fund through discretionary appropriations made available after the date of enactment of this Act.
“(3)
Use of funds.—
An IT working capital fund established under paragraph (1) may only be used—
“(A)
to improve, retire, or replace existing information technology systems in the covered agency to enhance cybersecurity and to improve efficiency and effectiveness across the life of a given workload, procured using full and open competition among all commercial items to the greatest extent practicable;
“(B)
to transition legacy information technology systems at the covered agency to commercial cloud computing and other innovative commercial platforms and technologies, including those serving more than 1 covered agency with common requirements;
“(C)
to assist and support covered agency efforts to provide adequate, risk-based, and cost-effective information technology capabilities that address evolving threats to information security;
“(D)
to reimburse funds transferred to the covered agency from the Fund with the approval of the Chief Information Officer, in consultation with the Chief Financial Officer, of the covered agency; and
“(E)
for a program, project, or activity or to increase funds for any program, project, or activity that has not been denied or restricted by Congress.
“(4)
Existing funds.—
An IT working capital fund may not be used to supplant funds provided for the operation and maintenance of any system within an appropriation for the covered agency at the time of establishment of the IT working capital fund.
“(5)
Prioritization of funds.—
The head of each covered agency—
“(A)
shall prioritize funds within the IT working capital fund of the covered agency to be used initially for cost savings activities approved by the Chief Information Officer of the covered agency; and
“(B)
may reprogram and transfer any amounts saved as a direct result of the cost savings activities approved under clause (i) [probably should be “subparagraph (A)”] for deposit into the IT working capital fund of the covered agency, consistent with paragraph (2)(A).
“(6)
Availability of funds.—
“(A)
In general.—
Any funds deposited into an IT working capital fund shall be available for obligation for the 3-year period beginning on the last day of the fiscal year in which the funds were deposited.
“(B)
Transfer of unobligated amounts.—
Any amounts in an IT working capital fund that are unobligated at the end of the 3-year period described in subparagraph (A) shall be transferred to the general fund of the Treasury.
“(7)
Agency cio responsibilities.—
In evaluating projects to be funded by the IT working capital fund of a covered agency, the Chief Information Officer of the covered agency shall consider, to the extent applicable, guidance issued under section 1094(b)(1) [probably should be “1078(b)(1)”] to evaluate applications for funding from the Fund that include factors including a strong business case, technical design, consideration of commercial off-the-shelf products and services, procurement strategy (including adequate use of rapid, iterative software development practices), and program management.
“(c)
Reporting Requirement.—
“(1)
In general.—
Not later than 1 year after the date of enactment of this Act, and every 6 months thereafter, the head of each covered agency shall submit to the Director, with respect to the IT working capital fund of the covered agency—
“(A)
a list of each information technology investment funded, including the estimated cost and completion date for each investment; and
“(B)
a summary by fiscal year of obligations, expenditures, and unused balances.
“(2)
Public availability.—
The Director shall make the information submitted under paragraph (1) publicly available on a website.
“SEC. 1078. ESTABLISHMENT OF TECHNOLOGY MODERNIZATION FUND AND BOARD.
“(a)
Definition.—
In this section, the term ‘agency’ has the meaning given the term in section 551 of title 5, United States Code.
“(b)
Technology Modernization Fund.—
“(1)
Establishment.—
There is established in the Treasury a Technology Modernization Fund for technology-related activities, to improve information technology, to enhance cybersecurity across the Federal Government, and to be administered in accordance with guidance issued by the Director.
“(2)
Administration of fund.—
The Administrator, in consultation with the Chief Information Officers Council and with the approval of the Director, shall administer the Fund in accordance with this subsection.
“(3)
Use of funds.—
The Administrator shall, in accordance with recommendations from the Board, use amounts in the Fund—
“(A)
to transfer such amounts, to remain available until expended, to the head of an agency for the acquisition of products and services, or the development of such products and services when more efficient and cost effective, to improve, retire, or replace existing Federal information technology systems to enhance cybersecurity and privacy and improve long-term efficiency and effectiveness;
“(B)
to transfer such amounts, to remain available until expended, to the head of an agency for the operation and procurement of information technology products and services, or the development of such products and services when more efficient and cost effective, and acquisition vehicles for use by agencies to improve Governmentwide efficiency and cybersecurity in accordance with the requirements of the agencies;
“(C)
to provide services or work performed in support of—
“(i)
the activities described in subparagraph (A) or (B); and
“(ii)
the Board and the Director in carrying out the responsibilities described in subsection (c)(2); and
“(D)
to fund only programs, projects, or activities or to fund increases for any programs, projects, or activities that have not been denied or restricted by Congress.
“(4)
Authorization of appropriations; credits; availability of funds.—
“(A)
Authorization of appropriations.—
There is authorized to be appropriated to the Fund $250,000,000 for each of fiscal years 2018 and 2019.
“(B)
Credits.—
In addition to any funds otherwise appropriated, the Fund shall be credited with all reimbursements, advances, or refunds or recoveries relating to information technology or services provided for the purposes described in paragraph (3).
“(C)
Availability of funds.—
Amounts deposited, credited, or otherwise made available to the Fund shall be available until expended for the purposes described in paragraph (3).
“(5)
Reimbursement.—
“(A)
Reimbursement by agency.—
“(i)
In general.—
The head of an agency shall reimburse the Fund for any transfer made under subparagraph (A) or (B) of paragraph (3), including any services or work performed in support of the transfer under paragraph (3)(C), in accordance with the terms established in a written agreement described in paragraph (6).
“(ii)
Reimbursement from subsequent appropriations.—
Notwithstanding any other provision of law, an agency may make a reimbursement required under clause (i) from any appropriation made available after the date of enactment of this Act [Dec. 12, 2017] for information technology activities, consistent with any applicable reprogramming law or guidelines of the Committees on Appropriations of the Senate and the House of Representatives.
“(iii)
Recording of obligation.—
Notwithstanding section 1501 of title 31, United States Code, an obligation to make a payment under a written agreement described in paragraph (6) in a fiscal year after the date of enactment of this Act shall be recorded in the fiscal year in which the payment is due.
“(B)
Prices fixed by administrator.—
“(i)
In general.—
The Administrator, in consultation with the Director, shall establish amounts to be paid by an agency under this paragraph and the terms of repayment for activities funded under paragraph (3), including any services or work performed in support of that development under paragraph (3)(C), at levels sufficient to ensure the solvency of the Fund, including operating expenses.
“(ii)
Review and approval.—
Before making any changes to the established amounts and terms of repayment, the Administrator shall conduct a review and obtain approval from the Director.
“(C)
Failure to make timely reimbursement.—
The Administrator may obtain reimbursement from an agency under this paragraph by the issuance of transfer and counterwarrants, or other lawful transfer documents, supported by itemized bills, if payment is not made by the agency during the 90-day period beginning after the expiration of a repayment period described in a written agreement described in paragraph (6).
“(6)
Written agreement.—
“(A)
In general.—
Before the transfer of funds to an agency under subparagraphs (A) and (B) of paragraph (3), the Administrator, in consultation with the Director, and the head of the agency shall enter into a written agreement—
“(i)
documenting the purpose for which the funds will be used and the terms of repayment, which may not exceed 5 years unless approved by the Director; and
“(ii)
which shall be recorded as an obligation as provided in paragraph (5)(A).
“(B)
Requirement for use of incremental funding, commercial products and services, and rapid, iterative development practices.—
The Administrator shall ensure—
“(i)
for any funds transferred to an agency under paragraph (3)(A), in the absence of compelling circumstances documented by the Administrator at the time of transfer, that such funds shall be transferred only on an incremental basis, tied to metric-based development milestones achieved by the agency through the use of rapid, iterative, development processes; and
“(ii)
that the use of commercial products and services are incorporated to the greatest extent practicable in activities funded under subparagraphs (A) and (B) of paragraph (3), and that the written agreement required under paragraph (6) documents this preference.
“(7)
Reporting requirements.—
“(A)
List of projects.—
“(i)
In general.—
Not later than 6 months after the date of enactment of this Act, the Director shall maintain a list of each project funded by the Fund, to be updated not less than quarterly, that includes a description of the project, project status (including any schedule delay and cost overruns), financial expenditure data related to the project, and the extent to which the project is using commercial products and services, including if applicable, a justification of why commercial products and services were not used and the associated development and integration costs of custom development.
“(ii)
Public availability.—
The list required under clause (i) shall be published on a public website in a manner that is, to the greatest extent possible, consistent with applicable law on the protection of classified information, sources, and methods.
“(B)
Comptroller general reports.—
Not later than 2 years after the date of enactment of this Act, and every 2 years thereafter, the Comptroller General of the United States shall submit to Congress and make publically available a report assessing—
“(i)
the costs associated with establishing the Fund and maintaining the oversight structure associated with the Fund compared with the cost savings associated with the projects funded both annually and over the life of the acquired products and services by the Fund;
“(ii)
the reliability of the cost savings estimated by agencies associated with projects funded by the Fund;
“(iii)
whether agencies receiving transfers of funds from the Fund used full and open competition to acquire the custom development of information technology products or services; and
“(iv)
the number of IT procurement, development, and modernization programs, offices, and entities in the Federal Government, including 18F and the United States Digital Services, the roles, responsibilities, and goals of those programs and entities, and the extent to which they duplicate work.
“(c)
Technology Modernization Board.—
“(1)
Establishment.—
There is established a Technology Modernization Board to evaluate proposals submitted by agencies for funding authorized under the Fund.
“(2)
Responsibilities.—
The responsibilities of the Board are—
“(A)
to provide input to the Director for the development of processes for agencies to submit modernization proposals to the Board and to establish the criteria by which those proposals are evaluated, which shall include—
“(i)
addressing the greatest security, privacy, and operational risks;
“(ii)
having the greatest Governmentwide impact; and
“(iii)
having a high probability of success based on factors including a strong business case, technical design, consideration of commercial off-the-shelf products and services, procurement strategy (including adequate use of rapid, agile iterative software development practices), and program management;
“(B)
to make recommendations to the Administrator to assist agencies in the further development and refinement of select submitted modernization proposals, based on an initial evaluation performed with the assistance of the Administrator;
“(C)
to review and prioritize, with the assistance of the Administrator and the Director, modernization proposals based on criteria established pursuant to subparagraph (A);
“(D)
to identify, with the assistance of the Administrator, opportunities to improve or replace multiple information technology systems with a smaller number of information technology services common to multiple agencies;
“(E)
to recommend the funding of modernization projects, in accordance with the uses described in subsection (b)(3), to the Administrator;
“(F)
to monitor, in consultation with the Administrator, progress and performance in executing approved projects and, if necessary, recommend the suspension or termination of funding for projects based on factors including the failure to meet the terms of a written agreement described in subsection (b)(6); and
“(G)
to monitor the operating costs of the Fund.
“(3)
Membership.—
The Board shall consist of 7 voting members.
“(4)
Chair.—
The Chair of the Board shall be the Administrator of the Office of Electronic Government.
“(5)
Permanent members.—
The permanent members of the Board shall be—
“(A)
the Administrator of the Office of Electronic Government; and
“(B)
a senior official from the General Services Administration having technical expertise in information technology development, appointed by the Administrator, with the approval of the Director.
“(6)
Additional members of the board.—
“(A)
Appointment.—
The other members of the Board shall be—
“(i)
1 employee of the National Protection and Programs Directorate [now Cybersecurity and Infrastructure Security Agency] of the Department of Homeland Security, appointed by the Secretary of Homeland Security; and
“(ii)
4 employees of the Federal Government primarily having technical expertise in information technology development, financial management, cybersecurity and privacy, and acquisition, appointed by the Director.
“(B)
Term.—
Each member of the Board described in paragraph (A) shall serve a term of 1 year, which shall be renewable not more than 4 times at the discretion of the appointing Secretary or Director, as applicable.
“(7)
Prohibition on compensation.—
Members of the Board may not receive additional pay, allowances, or benefits by reason of their service on the Board.
“(8)
Staff.—
Upon request of the Chair of the Board, the Director and the Administrator may detail, on a reimbursable or nonreimbursable basis, any employee of the Federal Government to the Board to assist the Board in carrying out the functions of the Board.
“(d)
Responsibilities of Administrator.—
“(1)
In general.—
In addition to the responsibilities described in subsection (b), the Administrator shall support the activities of the Board and provide technical support to, and, with the concurrence of the Director, oversight of, agencies that receive transfers from the Fund.
“(2)
Responsibilities.—
The responsibilities of the Administrator are—
“(A)
to provide direct technical support in the form of personnel services or otherwise to agencies transferred amounts under subsection (b)(3)(A) and for products, services, and acquisition vehicles funded under subsection (b)(3)(B);
“(B)
to assist the Board with the evaluation, prioritization, and development of agency modernization proposals.
“(C)
to perform regular project oversight and monitoring of approved agency modernization projects, in consultation with the Board and the Director, to increase the likelihood of successful implementation and reduce waste; and
“(D)
to provide the Director with information necessary to meet the requirements of subsection (b)(7).
“(e)
Effective Date.—
This section shall take effect on the date that is 90 days after the date of enactment of this Act.
“(f)
Sunset.—
“(1)
In general.—
On and after the date that is 2 years after the date on which the Comptroller General of the United States issues the third report required under subsection (b)(7)(B), the Administrator may not award or transfer funds from the Fund for any project that is not already in progress as of such date.
“(2)
Transfer of unobligated amounts.—
Not later than 90 days after the date on which all projects that received an award from the Fund are completed, any amounts in the Fund shall be transferred to the general fund of the Treasury and shall be used for deficit reduction.
“(3)
Termination of technology modernization board.—
Not later than 90 days after the date on which all projects that received an award from the Fund are completed, the Technology Modernization Board and all the authorities of subsection (c) shall terminate.”

Executive Documents
Ex. Ord. No. 13960. Promoting the Use of Trustworthy Artificial Intelligence in the Federal Government

Ex. Ord. No. 13960, Dec. 3, 2020, 85 F.R. 78939, provided:

By the authority vested in me as President by the Constitution and the laws of the United States of America, it is hereby ordered as follows:

Section 1. Purpose. Artificial intelligence (AI) promises to drive the growth of the United States economy and improve the quality of life of all Americans. In alignment with Executive Order 13859 of February 11, 2019 (Maintaining American Leadership in Artificial Intelligence) [42 U.S.C. 6601 note], executive departments and agencies (agencies) have recognized the power of AI to improve their operations, processes, and procedures; meet strategic goals; reduce costs; enhance oversight of the use of taxpayer funds; increase efficiency and mission effectiveness; improve quality of services; improve safety; train workforces; and support decision making by the Federal workforce, among other positive developments. Given the broad applicability of AI, nearly every agency and those served by those agencies can benefit from the appropriate use of AI.

Agencies are already leading the way in the use of AI by applying it to accelerate regulatory reform; review Federal solicitations for regulatory compliance; combat fraud, waste, and abuse committed against taxpayers; identify information security threats and assess trends in related illicit activities; enhance the security and interoperability of Federal Government information systems; facilitate review of large datasets; streamline processes for grant applications; model weather patterns; facilitate predictive maintenance; and much more.

Agencies are encouraged to continue to use AI, when appropriate, to benefit the American people. The ongoing adoption and acceptance of AI will depend significantly on public trust. Agencies must therefore design, develop, acquire, and use AI in a manner that fosters public trust and confidence while protecting privacy, civil rights, civil liberties, and American values, consistent with applicable law and the goals of Executive Order 13859.

Certain agencies have already adopted guidelines and principles for the use of AI for national security or defense purposes, such as the Department of Defense’s Ethical Principles for Artificial Intelligence (February 24, 2020), and the Office of the Director of National Intelligence’s Principles of Artificial Intelligence Ethics for the Intelligence Community (July 23, 2020) and its Artificial Intelligence Ethics Framework for the Intelligence Community (July 23, 2020). Such guidelines and principles ensure that the use of AI in those contexts will benefit the American people and be worthy of their trust.

Section 3 of this order establishes additional principles (Principles) for the use of AI in the Federal Government for purposes other than national security and defense, to similarly ensure that such uses are consistent with our Nation’s values and are beneficial to the public. This order further establishes a process for implementing these Principles through common policy guidance across agencies.

Sec. 2. Policy. (a) It is the policy of the United States to promote the innovation and use of AI, where appropriate, to improve Government operations and services in a manner that fosters public trust, builds confidence in AI, protects our Nation’s values, and remains consistent with all applicable laws, including those related to privacy, civil rights, and civil liberties.

(b) It is the policy of the United States that responsible agencies, as defined in section 8 of this order, shall, when considering the design, development, acquisition, and use of AI in Government, be guided by the common set of Principles set forth in section 3 of this order, which are designed to foster public trust and confidence in the use of AI, protect our Nation’s values, and ensure that the use of AI remains consistent with all applicable laws, including those related to privacy, civil rights, and civil liberties.

(c) It is the policy of the United States that the Principles for the use of AI in Government shall be governed by common policy guidance issued by the Office of Management and Budget (OMB) as outlined in section 4 of this order, consistent with applicable law.

Sec. 3. Principles for Use of AI in Government. When designing, developing, acquiring, and using AI in the Federal Government, agencies shall adhere to the following Principles:

(a) Lawful and respectful of our Nation’s values. Agencies shall design, develop, acquire, and use AI in a manner that exhibits due respect for our Nation’s values and is consistent with the Constitution and all other applicable laws and policies, including those addressing privacy, civil rights, and civil liberties.

(b) Purposeful and performance-driven. Agencies shall seek opportunities for designing, developing, acquiring, and using AI, where the benefits of doing so significantly outweigh the risks, and the risks can be assessed and managed.

(c) Accurate, reliable, and effective. Agencies shall ensure that their application of AI is consistent with the use cases for which that AI was trained, and such use is accurate, reliable, and effective.

(d) Safe, secure, and resilient. Agencies shall ensure the safety, security, and resiliency of their AI applications, including resilience when confronted with systematic vulnerabilities, adversarial manipulation, and other malicious exploitation.

(e) Understandable. Agencies shall ensure that the operations and outcomes of their AI applications are sufficiently understandable by subject matter experts, users, and others, as appropriate.

(f) Responsible and traceable. Agencies shall ensure that human roles and responsibilities are clearly defined, understood, and appropriately assigned for the design, development, acquisition, and use of AI. Agencies shall ensure that AI is used in a manner consistent with these Principles and the purposes for which each use of AI is intended. The design, development, acquisition, and use of AI, as well as relevant inputs and outputs of particular AI applications, should be well documented and traceable, as appropriate and to the extent practicable.

(g) Regularly monitored. Agencies shall ensure that their AI applications are regularly tested against these Principles. Mechanisms should be maintained to supersede, disengage, or deactivate existing applications of AI that demonstrate performance or outcomes that are inconsistent with their intended use or this order.

(h) Transparent. Agencies shall be transparent in disclosing relevant information regarding their use of AI to appropriate stakeholders, including the Congress and the public, to the extent practicable and in accordance with applicable laws and policies, including with respect to the protection of privacy and of sensitive law enforcement, national security, and other protected information.

(i) Accountable. Agencies shall be accountable for implementing and enforcing appropriate safeguards for the proper use and functioning of their applications of AI, and shall monitor, audit, and document compliance with those safeguards. Agencies shall provide appropriate training to all agency personnel responsible for the design, development, acquisition, and use of AI.

Sec. 4. Implementation of Principles. (a) Existing OMB policies currently address many aspects of information and information technology design, development, acquisition, and use that apply, but are not unique, to AI. To the extent they are consistent with the Principles set forth in this order and applicable law, these existing policies shall continue to apply to relevant aspects of AI use in Government.

(b) Within 180 days of the date of this order [Dec. 3, 2020], the Director of OMB (Director), in coordination with key stakeholders identified by the Director, shall publicly post a roadmap for the policy guidance that OMB intends to create or revise to better support the use of AI, consistent with this order. This roadmap shall include, where appropriate, a schedule for engaging with the public and timelines for finalizing relevant policy guidance. In addressing novel aspects of the use of AI in Government, OMB shall consider updates to the breadth of its policy guidance, including OMB Circulars and Management Memoranda.

(c) Agencies shall continue to use voluntary consensus standards developed with industry participation, where available, when such use would not be inconsistent with applicable law or otherwise impracticable. Such standards shall also be taken into consideration by OMB when revising or developing AI guidance.

Sec. 5. Agency Inventory of AI Use Cases. (a) Within 60 days of the date of this order, the Federal Chief Information Officers Council (CIO Council), in coordination with other interagency bodies as it deems appropriate, shall identify, provide guidance on, and make publicly available the criteria, format, and mechanisms for agency inventories of non-classified and non-sensitive use cases of AI by agencies.

(b) Within 180 days of the CIO Council’s completion of the directive in section 5(a) of this order, and annually thereafter, each agency shall prepare an inventory of its non-classified and non-sensitive use cases of AI, within the scope defined by section 9 of this order, including current and planned uses, consistent with the agency’s mission.

(c) As part of their respective inventories of AI use cases, agencies shall identify, review, and assess existing AI deployed and operating in support of agency missions for any inconsistencies with this order.

(i) Within 120 days of completing their respective inventories, agencies shall develop plans either to achieve consistency with this order for each AI application or to retire AI applications found to be developed or used in a manner that is not consistent with this order. These plans must be approved by the agency-designated responsible official(s), as described in section 8 of this order, within this same 120-day time period.

(ii) In coordination with the Agency Data Governance Body and relevant officials from agencies not represented within that body, agencies shall strive to implement the approved plans within 180 days of plan approval, subject to existing resource levels.

(d) Within 60 days of the completion of their respective inventories of use cases of AI, agencies shall share their inventories with other agencies, to the extent practicable and consistent with applicable law and policy, including those concerning protection of privacy and of sensitive law enforcement, national security, and other protected information. This sharing shall be coordinated through the CIO and Chief Data Officer Councils, as well as other interagency bodies, as appropriate, to improve interagency coordination and information sharing for common use cases.

(e) Within 120 days of the completion of their inventories, agencies shall make their inventories available to the public, to the extent practicable and in accordance with applicable law and policy, including those concerning the protection of privacy and of sensitive law enforcement, national security, and other protected information.

Sec. 6. Interagency Coordination. Agencies are expected to participate in interagency bodies for the purpose of advancing the implementation of the Principles and the use of AI consistent with this order. Within 45 days of this order, the CIO Council shall publish a list of recommended interagency bodies and forums in which agencies may elect to participate, as appropriate and consistent with their respective authorities and missions.

Sec. 7. AI Implementation Expertise. (a) Within 90 days of the date of this order, the Presidential Innovation Fellows (PIF) program, administered by the General Services Administration (GSA) in collaboration with other agencies, shall identify priority areas of expertise and establish an AI track to attract experts from industry and academia to undertake a period of work at an agency. These PIF experts will work within agencies to further the design, development, acquisition, and use of AI in Government, consistent with this order.

(b) Within 45 days of the date of this order, the Office of Personnel Management (OPM), in coordination with GSA and relevant agencies, shall create an inventory of Federal Government rotational programs and determine how these programs can be used to expand the number of employees with AI expertise at the agencies.

(c) Within 180 days of the creation of the inventory of Government rotational programs described in section 7(b) of this order, OPM shall issue a report with recommendations for how the programs in the inventory can be best used to expand the number of employees with AI expertise at the agencies. This report shall be shared with the interagency coordination bodies identified pursuant to section 6 of this order, enabling agencies to better use these programs for the use of AI, consistent with this order.

Sec. 8. Responsible Agencies and Officials. (a) For purposes of this order, the term “agency” refers to all agencies described in section 3502, subsection (1), of title 44, United States Code, except for the agencies described in section 3502, subsection (5), of title 44.

(b) This order applies to agencies that have use cases for AI that fall within the scope defined in section 9 of this order, and excludes the Department of Defense and those agencies and agency components with functions that lie wholly within the Intelligence Community. The term “Intelligence Community” has the meaning given the term in section 3003 of title 50, United States Code.

(c) Within 30 days of the date of this order, each agency shall specify the responsible official(s) at that agency who will coordinate implementation of the Principles set forth in section 3 of this order with the Agency Data Governance Body and other relevant officials and will collaborate with the interagency coordination bodies identified pursuant to section 6 of this order.

Sec. 9. Scope of Application. (a) This order uses the definition of AI set forth in section 238(g) of the [John S. McCain] National Defense Authorization Act for Fiscal Year 2019 [Pub. L. 115–232, 10 U.S.C. 2358 note] as a reference point. As Federal Government use of AI matures and evolves, OMB guidance developed or revised pursuant to section 4 of this order shall include such definitions as are necessary to ensure the application of the Principles in this order to appropriate use cases.

(b) Except for the exclusions set forth in section 9(d) of this order, or provided for by applicable law, the Principles and implementation guidance in this order shall apply to AI designed, developed, acquired, or used specifically to advance the execution of agencies’ missions, enhance decision making, or provide the public with a specified benefit.

(c) This order applies to both existing and new uses of AI; both stand-alone AI and AI embedded within other systems or applications; AI developed both by the agency or by third parties on behalf of agencies for the fulfilment of specific agency missions, including relevant data inputs used to train AI and outputs used in support of decision making; and agencies’ procurement of AI applications.

(d) This order does not apply to:

(i) AI used in defense or national security systems (as defined in 44 U.S.C. 3552(b)(6) or as determined by the agency), in whole or in part, although agencies shall adhere to other applicable guidelines and principles for defense and national security purposes, such as those adopted by the Department of Defense and the Office of the Director of National Intelligence;

(ii) AI embedded within common commercial products, such as word processors or map navigation systems, while noting that Government use of such products must nevertheless comply with applicable law and policy to assure the protection of safety, security, privacy, civil rights, civil liberties, and American values; and

(iii) AI research and development (R&D) activities, although the Principles and OMB implementation guidance should inform any R&D directed at potential future applications of AI in the Federal Government.

Sec. 10. General Provisions. (a) Nothing in this order shall be construed to impair or otherwise affect:

(i) the authority granted by law to an executive department or agency, or the head thereof; or

(ii) the functions of the Director relating to budgetary, administrative, or legislative proposals.

(b) This order shall be implemented consistent with applicable law and subject to the availability of appropriations.

(c) This order is not intended to, and does not, create any right or benefit, substantive or procedural, enforceable at law or in equity by any party against the United States, its departments, agencies, or entities, its officers, employees, or agents, or any other person.

Donald J. Trump.