142 Computer Technician jobs in Overgaard, AZ

      Recommended
      Apply Directly
      Information Systems Technician

      U.S. Navy

      Taylor, AZ 85939

Education Assistance, Health Insurance, Retirement Benefit
      New, Posted 1 day ago
      Easy Apply
      Counter Sales Representative

      Arizona Building Supply

      Heber, AZ 85928

      Onsite

      • High School Diploma or GED required.
      • Must be able to multi-task.
• Must be able to drive and walk throughout yards, plants, and offices.
• Must be detail oriented and highly observant.
• Must possess good math skills; able to use a calculator and computer.
      • Computer literate with Microsoft Office products and Trend system
      ATCI-4990089-S1865430

      Accenture

      Snowflake, AZ 85937

      ~ 38 min Onsite

• This position is based at the Bengaluru location.
      • Experience with data architecture frameworks and methodologies (e.g., TOGAF)
• Understanding of event-driven design patterns and practices; experience with message brokers like RabbitMQ or ActiveMQ
• NoSQL Databases: Experience with at least one NoSQL database such as MongoDB, Cassandra, or Couchbase for handling unstructured data
      • A minimum of 15-18 years of progressive information technology experience is required
• Query Languages: Experience with at least one query language such as Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop); familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language)
• 15 years of full-time education required
      • Exposure to semantic web technologies and standards
      • Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView
      • Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras
      • Data Warehousing and BI Tools: Expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery)
      • Database Technologies: Relational Databases: Expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server
      • AWS Certified Data Engineer Associate / Microsoft Certified: Azure Data Engineer Associate / Google Cloud Certified Professional Data Engineer certification is mandatory
      • Data Modeling and Architecture: Proficiency in data modeling techniques (conceptual, logical, and physical models)
• Cloud Platforms: Experience with cloud data services from at least one provider: AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow)
      • IoT and Industrial Data Systems: Familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA). Experience with IoT data platforms like AWS IoT, Azure IoT Hub, and Google Cloud IoT Core
• The Industrial Data Architect will be responsible for developing and overseeing industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives
      • Minimum 12 year(s) of experience is required
• Graph Databases: Proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB
• Understanding of distributed computing and data processing frameworks
      • Knowledge of database design principles and normalization
      • Exposure to edge computing platforms like AWS IoT Greengrass or Azure IoT Edge
      • Big Data Technologies: Experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka
• Experience with one or more streaming data platforms such as Apache Kafka, Amazon Kinesis, or Apache Flink
• Ability to design and implement real-time data pipelines; familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow
• Data Governance and Security: Understanding of data governance principles, data quality management, and metadata management; knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques
      • Data Integration and ETL (Extract, Transform, Load): Proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi). Experience with data integration tools and techniques to consolidate data from various sources
      • AI/ML, GenAI: Experience working on data readiness for feeding into AI/ML/GenAI applications
      New, Posted 12 hours ago
      Apply Directly
      Housekeeping Supervisor PM

      Enchantment Resort

      Sedona, AZ 86336

      High school diploma or GED

      Driver License

      ~ 2 hr 34 min Onsite

      • 90% walking, standing and bending
      • Ability to drive golf cart
      • Lifting/Carrying up to 50 lbs.
      • Ability to direct staff in a positive manner.
      • High school graduate or equivalent.
      • 10% Sitting
      • Use of cleaning solutions
      • Prior housekeeping supervisory experience required.
• Must be detail oriented, with strong verbal and written communication skills.
      • English skills necessary to communicate with guests.
      • Ability to work in all types of weather conditions
      • Distance vision 1-3 feet
      • Hearing and Manual dexterity
      New, Posted 1 day ago
      Recommended
      Senior Software Engineer

      AI Technology Insights

      Snowflake, AZ 85937

      Paid Relocation to Snowflake, AZ

      Remote

      • Cloud Data Platforms – Hands-on experience with Azure Data Lake, AWS Redshift, Google BigQuery, or other cloud-based data solutions
      • Problem-Solving & Business Acumen – Ability to translate complex data into actionable insights and align solutions with business goals
      • BI & Visualization Tools – Proficiency in Tableau, Power BI, or similar tools for data visualization and reporting
      • Collaboration & Communication – Strong interpersonal skills with a positive, “can-do” attitude to work effectively with cross-functional teams
      • Skills/Experience
      • Big Data & Streaming – Familiarity with Kafka, Snowflake, Hadoop, or Databricks for handling large-scale data processing
      • Database Expertise – Strong experience with SQL and NoSQL databases for querying, data modeling, and performance optimization
      • Responsibilities
      • ETL & Data Engineering – Expertise in Azure Data Factory (ADF), Apache Spark, Airflow, or other ETL tools to design scalable data pipelines
      • Programming for Data Processing – Proficiency in Python (Pandas, NumPy, PySpark), R, or similar for data manipulation and analysis

      Recent Searches

        Browse Jobs in Top Cities

        Browse Jobs by State

        Browse Jobs by Title

        Post a Job

        About

        Advice

        Contact

        © 2026 Jobs2Careers. All rights reserved.

        Privacy Policy

        Terms of Use

Your Privacy Choices — California Consumer Privacy Act (CCPA) Opt-Out

        Logos provided by Logo.dev

        Jobs2Careers Powered by Talroo
