Recently, Large Language Models (LLMs) such as Chat Generative Pre-trained Transformer (ChatGPT) and Generative Pre-trained Transformer 4 (GPT-4) have demonstrated remarkable performance in the general domain. However, their lack of adaptation to particular domains leads to hallucination when these LLMs respond in domain-specific contexts. Although this issue has attracted widespread attention, existing domain-centered fine-tuning efforts have predominantly focused on sectors such as medicine, finance, and law, leaving critical areas such as power energy relatively unexplored. To bridge this gap, this paper introduces PowerPulse, a novel chat model for the power energy domain. Built upon the open and efficient foundation language model (LLaMA) architecture, PowerPulse is fine-tuned specifically on Chinese power-sector domain knowledge. This work marks the first application of the LLaMA model in the field of power energy. By leveraging pertinent pre-training data and instruction fine-tuning datasets tailored for the power energy domain, PowerPulse achieves strong performance on tasks such as text generation, summary extraction, and topic classification. Experimental results validate the efficacy of PowerPulse, making a significant contribution to the advancement of specialized language models in specific domains.